Development guide
Prerequisites
- Python 3.12+
- uv (package manager)
- Docker and Docker Compose (for dependencies and testing)
- Git
Initial setup
1. Clone and install
git clone <repo-url> proxy-pool
cd proxy-pool
# Install all dependencies (including dev) in a virtual env
uv sync
# Verify installation
uv run python -c "import proxy_pool; print('OK')"
uv sync creates a .venv/ in the project root, installs all dependencies from uv.lock, and installs the proxy_pool package in editable mode (thanks to the src/ layout and pyproject.toml build config).
2. Start infrastructure
# Start PostgreSQL and Redis
docker compose up -d postgres redis
# Verify they're running
docker compose ps
3. Configure environment
cp .env.example .env
# Edit .env with your local settings
Key variables:
DATABASE_URL=postgresql+asyncpg://proxypool:proxypool@localhost:5432/proxypool
REDIS_URL=redis://localhost:6379/0
SECRET_KEY=your-random-secret-for-dev
LOG_LEVEL=DEBUG
# Optional: SMTP for notifier plugin testing
SMTP_HOST=
SMTP_PORT=587
SMTP_USER=
SMTP_PASSWORD=
ALERT_EMAIL=
4. Run migrations
uv run alembic upgrade head
5. Start the application
# API server (with hot reload)
uv run uvicorn proxy_pool.app:create_app --factory --reload --port 8000
# In a separate terminal: ARQ worker
uv run arq proxy_pool.worker.settings.WorkerSettings
The API is now available at http://localhost:8000. OpenAPI docs are at http://localhost:8000/docs.
Project layout
proxy-pool/
├── src/proxy_pool/ # Application source code
│ ├── app.py # App factory + lifespan
│ ├── config.py # Settings (env-driven)
│ ├── common/ # Shared utilities
│ ├── db/ # Database infrastructure
│ ├── proxy/ # Proxy domain module
│ ├── accounts/ # Accounts domain module
│ ├── plugins/ # Plugin system + built-in plugins
│ └── worker/ # ARQ task definitions
├── tests/ # Test suite
├── alembic/ # Migration files
├── docs/ # This documentation
└── pyproject.toml # Project config (uv, ruff, mypy, pytest)
See 01-architecture.md for detailed structure and rationale.
Working with the database
Creating a migration
# Auto-generate from model changes
uv run alembic revision --autogenerate -m "add proxy_tags table"
# Review the generated migration!
cat alembic/versions/NNN_add_proxy_tags_table.py
# Apply it
uv run alembic upgrade head
Always review autogenerated migrations. Alembic can miss custom indexes, enum type changes, and data migrations. Common things to verify:
- Enum types are created/altered correctly.
- Index names match the naming convention.
- downgrade() reverses the change completely.
- No data is dropped unintentionally.
Useful Alembic commands
# Show current revision
uv run alembic current
# Show migration history
uv run alembic history --verbose
# Downgrade one step
uv run alembic downgrade -1
# Downgrade to a specific revision
uv run alembic downgrade abc123
# Generate a blank migration (for data migrations)
uv run alembic revision -m "backfill proxy scores"
Database shell
# Via Docker
docker compose exec postgres psql -U proxypool
# Or directly
psql postgresql://proxypool:proxypool@localhost:5432/proxypool
Running tests
Quick: unit tests only (no Docker needed)
uv run pytest tests/unit/ -x -v
Full: integration tests with Docker dependencies
# Start test infrastructure
docker compose -f docker-compose.yml -f docker-compose.test.yml up -d postgres redis
# Run all tests
uv run pytest tests/ -x -v --timeout=30
# Or run via Docker (how CI does it)
docker compose -f docker-compose.yml -f docker-compose.test.yml run --rm test
Test organization
- tests/unit/ — No I/O. All external dependencies are mocked. Fast.
- tests/integration/ — Uses real PostgreSQL and Redis via Docker. Tests full request flows, database queries, and cache behavior.
- tests/plugins/ — Plugin-specific tests. Most are unit tests, but some (like the SMTP notifier) may use integration fixtures.
Key fixtures (in conftest.py)
@pytest.fixture
async def db_session():
"""Provides an async SQLAlchemy session rolled back after each test."""
@pytest.fixture
async def redis():
"""Provides a Redis connection flushed after each test."""
@pytest.fixture
async def client(db_session, redis):
"""Provides an httpx.AsyncClient wired to a test app instance."""
@pytest.fixture
def registry():
"""Provides a PluginRegistry with built-in plugins loaded."""
Writing a test
# tests/unit/test_scoring.py
from proxy_pool.proxy.service import compute_proxy_score
def test_score_weights_latency():
checks = [make_check(passed=True, latency_ms=100)]
score = compute_proxy_score(make_proxy(), checks, make_context())
assert 0.8 < score <= 1.0
def test_dead_proxy_gets_zero_score():
checks = [make_check(passed=False)]
score = compute_proxy_score(make_proxy(), checks, make_context())
assert score == 0.0
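The make_check, make_proxy, and make_context helpers above are assumed factories from the test suite. A rough, self-contained sketch of what they might provide follows, paired with a toy stand-in for compute_proxy_score (not the project's actual scoring formula) so the example runs on its own:

```python
from dataclasses import dataclass

# Hypothetical test factories; field names are guesses based on the tests above.
@dataclass
class Check:
    passed: bool
    latency_ms: int = 0

def make_check(passed: bool, latency_ms: int = 0) -> Check:
    return Check(passed=passed, latency_ms=latency_ms)

def make_proxy() -> dict:
    return {"protocol": "http", "host": "127.0.0.1", "port": 8080}

def make_context() -> dict:
    return {}

# Toy stand-in for proxy_pool.proxy.service.compute_proxy_score, NOT the
# real algorithm: pass rate discounted by average latency of passing checks.
def compute_proxy_score(proxy: dict, checks: list[Check], ctx: dict) -> float:
    passed = [c for c in checks if c.passed]
    if not passed:
        return 0.0
    pass_rate = len(passed) / len(checks)
    avg_latency = sum(c.latency_ms for c in passed) / len(passed)
    return pass_rate * max(0.0, 1.0 - avg_latency / 1000.0)
```

With this stand-in, a single passing check at 100 ms scores in the (0.8, 1.0] band and an all-failed history scores 0.0, consistent with the two unit tests shown above.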
# tests/integration/test_acquire_flow.py
async def test_acquire_deducts_credit(client, db_session):
user = await create_user_with_credits(db_session, credits=10)
await create_active_proxy(db_session)
response = await client.post(
"/proxies/acquire",
headers={"Authorization": f"Bearer {user.api_key}"},
json={"protocol": "http"},
)
assert response.status_code == 200
assert response.json()["credits_remaining"] == 9
Code quality
Linting and formatting
# Check
uv run ruff check src/ tests/
uv run ruff format --check src/ tests/
# Fix
uv run ruff check --fix src/ tests/
uv run ruff format src/ tests/
Type checking
uv run mypy src/
mypy is configured with strict = true in pyproject.toml. The pydantic.mypy plugin is enabled for correct Pydantic model inference.
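The corresponding pyproject.toml fragment looks roughly like this (the repo may set additional options):

```toml
[tool.mypy]
strict = true
plugins = ["pydantic.mypy"]
```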
Pre-commit (optional)
If you want automated checks on every commit:
uv tool install pre-commit
pre-commit install
Docker workflow
Build the image
docker compose build
Run the full stack
# Run migrations + start API + worker
docker compose --profile migrate up -d migrate
docker compose up -d api worker
View logs
docker compose logs -f api worker
Rebuild after code changes
docker compose build api
docker compose up -d api worker
Shell into a running container
docker compose exec api bash
docker compose exec postgres psql -U proxypool
docker compose exec redis redis-cli
Adding a new plugin
1. Create a file in src/proxy_pool/plugins/builtin/<type>/your_plugin.py.
2. Implement the relevant Protocol (see 02-plugin-system.md).
3. Define create_plugin(settings: Settings) -> YourPlugin | None.
4. Add tests in tests/plugins/test_your_plugin.py.
5. Restart the app — the plugin is auto-discovered.
For third-party plugins, place files in plugins/contrib/ (or mount a directory at /app/plugins-contrib in Docker).
Common development tasks
Add a new API endpoint
1. Define Pydantic schemas in <domain>/schemas.py.
2. Add business logic in <domain>/service.py.
3. Create the route in <domain>/router.py.
4. Register the router in app.py if it's a new router.
5. Add tests.
Add a new database table
1. Define the SQLAlchemy model in <domain>/models.py.
2. Import the model in db/base.py (so Alembic sees it).
3. Generate a migration: uv run alembic revision --autogenerate -m "description".
4. Review and apply: uv run alembic upgrade head.
5. Add tests.
Add a new background task
1. Define the task function in worker/tasks_<category>.py.
2. Register it in worker/settings.py (add it to the functions list, and to cron_jobs if it's periodic).
3. Restart the worker.
4. Add tests.
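A minimal sketch of steps 1 and 2, with a hypothetical task name and a toy body (ARQ tasks are async functions taking the worker context as their first argument, and WorkerSettings is a plain class that lists them):

```python
import asyncio

# worker/tasks_maintenance.py — hypothetical task; the real one would
# query the database instead of reading a preloaded list from the context.
async def purge_dead_proxies(ctx: dict) -> int:
    dead = [p for p in ctx.get("proxies", []) if not p["alive"]]
    return len(dead)

# worker/settings.py — ARQ picks tasks up from a plain settings class.
class WorkerSettings:
    functions = [purge_dead_proxies]
    # cron_jobs = [cron(purge_dead_proxies, hour=3)]  # if periodic

# Quick smoke check outside the worker.
result = asyncio.run(
    purge_dead_proxies({"proxies": [{"alive": False}, {"alive": True}]})
)
```

Because the task is just an async function, the corresponding unit test can call it directly with a fabricated ctx dict, without running the worker at all.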