# Development guide

## Prerequisites

- Python 3.12+
- [uv](https://docs.astral.sh/uv/) (package manager)
- Docker and Docker Compose (for dependencies and testing)
- Git

## Initial setup

### 1. Clone and install

```bash
git clone <repo-url> proxy-pool
cd proxy-pool

# Install all dependencies (including dev) in a virtual env
uv sync

# Verify installation
uv run python -c "import proxy_pool; print('OK')"
```

`uv sync` creates a `.venv/` in the project root, installs all dependencies from `uv.lock`, and installs the `proxy_pool` package in editable mode (thanks to the `src/` layout and `pyproject.toml` build config).

### 2. Start infrastructure

```bash
# Start PostgreSQL and Redis
docker compose up -d postgres redis

# Verify they're running
docker compose ps
```

### 3. Configure environment

```bash
cp .env.example .env
# Edit .env with your local settings
```

Key variables:

```env
DATABASE_URL=postgresql+asyncpg://proxypool:proxypool@localhost:5432/proxypool
REDIS_URL=redis://localhost:6379/0
SECRET_KEY=your-random-secret-for-dev
LOG_LEVEL=DEBUG

# Optional: SMTP for notifier plugin testing
SMTP_HOST=
SMTP_PORT=587
SMTP_USER=
SMTP_PASSWORD=
ALERT_EMAIL=
```

### 4. Run migrations

```bash
uv run alembic upgrade head
```

### 5. Start the application

```bash
# API server (with hot reload)
uv run uvicorn proxy_pool.app:create_app --factory --reload --port 8000

# In a separate terminal: ARQ worker
uv run arq proxy_pool.worker.settings.WorkerSettings
```

The API is now available at `http://localhost:8000`. OpenAPI docs are at `http://localhost:8000/docs`.
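
The `--factory` flag tells uvicorn to import the dotted path, call it with no arguments, and serve the returned app. A minimal self-contained sketch of that contract (a bare ASGI callable for illustration, not the project's actual FastAPI factory):

```python
import asyncio


def create_app():
    """Factory: uvicorn's --factory flag calls this and serves the result."""

    async def app(scope, receive, send):
        # Minimal ASGI app: answer every HTTP request with 200 OK.
        assert scope["type"] == "http"
        await send({
            "type": "http.response.start",
            "status": 200,
            "headers": [(b"content-type", b"text/plain")],
        })
        await send({"type": "http.response.body", "body": b"OK"})

    return app
```

Doing setup inside the factory (rather than at module import) is what lets tests build isolated app instances with their own dependencies.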

## Project layout

```
proxy-pool/
├── src/proxy_pool/        # Application source code
│   ├── app.py             # App factory + lifespan
│   ├── config.py          # Settings (env-driven)
│   ├── common/            # Shared utilities
│   ├── db/                # Database infrastructure
│   ├── proxy/             # Proxy domain module
│   ├── accounts/          # Accounts domain module
│   ├── plugins/           # Plugin system + built-in plugins
│   └── worker/            # ARQ task definitions
├── tests/                 # Test suite
├── alembic/               # Migration files
├── docs/                  # This documentation
└── pyproject.toml         # Project config (uv, ruff, mypy, pytest)
```

See `01-architecture.md` for detailed structure and rationale.

## Working with the database

### Creating a migration

```bash
# Auto-generate from model changes
uv run alembic revision --autogenerate -m "add proxy_tags table"

# Review the generated migration!
cat alembic/versions/NNN_add_proxy_tags_table.py

# Apply it
uv run alembic upgrade head
```

Always review autogenerated migrations. Alembic can miss custom indexes, enum type changes, and data migrations. Common things to verify:

- Enum types are created/altered correctly.
- Index names match the naming convention.
- `downgrade()` reverses the change completely.
- No data is dropped unintentionally.
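
The `downgrade()` check in particular can be verified mechanically: applying `upgrade()` then `downgrade()` should leave the schema exactly where it started. A toy illustration of that property using stdlib sqlite3 (not Alembic's actual `op` API):

```python
import sqlite3


def upgrade(conn: sqlite3.Connection) -> None:
    # What a migration's upgrade() might do: add a table.
    conn.execute("CREATE TABLE proxy_tags (proxy_id INTEGER, tag TEXT)")


def downgrade(conn: sqlite3.Connection) -> None:
    # A complete downgrade reverses upgrade() exactly.
    conn.execute("DROP TABLE proxy_tags")


def table_names(conn: sqlite3.Connection) -> set[str]:
    rows = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
    return {row[0] for row in rows}


conn = sqlite3.connect(":memory:")
before = table_names(conn)
upgrade(conn)
assert "proxy_tags" in table_names(conn)
downgrade(conn)
assert table_names(conn) == before  # schema is back where it started
```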

### Useful Alembic commands

```bash
# Show current revision
uv run alembic current

# Show migration history
uv run alembic history --verbose

# Downgrade one step
uv run alembic downgrade -1

# Downgrade to a specific revision
uv run alembic downgrade abc123

# Generate a blank migration (for data migrations)
uv run alembic revision -m "backfill proxy scores"
```

### Database shell

```bash
# Via Docker
docker compose exec postgres psql -U proxypool

# Or directly
psql postgresql://proxypool:proxypool@localhost:5432/proxypool
```

## Running tests

### Quick: unit tests only (no Docker needed)

```bash
uv run pytest tests/unit/ -x -v
```

### Full: integration tests with Docker dependencies

```bash
# Start test infrastructure
docker compose -f docker-compose.yml -f docker-compose.test.yml up -d postgres redis

# Run all tests
uv run pytest tests/ -x -v --timeout=30

# Or run via Docker (how CI does it)
docker compose -f docker-compose.yml -f docker-compose.test.yml run --rm test
```

### Test organization

- `tests/unit/` — No I/O. All external dependencies are mocked. Fast.
- `tests/integration/` — Uses real PostgreSQL and Redis via Docker. Tests full request flows, database queries, and cache behavior.
- `tests/plugins/` — Plugin-specific tests. Most are unit tests, but some (like SMTP notifier) may use integration fixtures.

### Key fixtures (in `conftest.py`)

```python
@pytest.fixture
async def db_session():
    """Provides an async SQLAlchemy session rolled back after each test."""


@pytest.fixture
async def redis():
    """Provides a Redis connection flushed after each test."""


@pytest.fixture
async def client(db_session, redis):
    """Provides an httpx.AsyncClient wired to a test app instance."""


@pytest.fixture
def registry():
    """Provides a PluginRegistry with built-in plugins loaded."""
```
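
The rollback-per-test pattern behind `db_session` is worth seeing concretely. Sketched here with stdlib sqlite3 instead of the project's async SQLAlchemy session (so the details are illustrative only): each test runs inside a transaction that is rolled back afterwards, leaving the database untouched.

```python
import sqlite3
from contextlib import contextmanager


@contextmanager
def rolled_back_session(conn: sqlite3.Connection):
    """Yield a connection whose writes are discarded afterwards,
    mimicking what the db_session fixture does per test."""
    conn.execute("BEGIN")
    try:
        yield conn
    finally:
        conn.rollback()


conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually
conn.execute("CREATE TABLE proxies (host TEXT)")

with rolled_back_session(conn) as session:
    session.execute("INSERT INTO proxies VALUES ('10.0.0.1')")
    # Inside the "test", the row is visible.
    assert session.execute("SELECT COUNT(*) FROM proxies").fetchone()[0] == 1

# After the "test", the insert is gone.
assert conn.execute("SELECT COUNT(*) FROM proxies").fetchone()[0] == 0
```

This is why integration tests can share one database: no test's writes survive to pollute the next.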

### Writing a test

```python
# tests/unit/test_scoring.py

from proxy_pool.proxy.service import compute_proxy_score


def test_score_weights_latency():
    checks = [make_check(passed=True, latency_ms=100)]
    score = compute_proxy_score(make_proxy(), checks, make_context())
    assert 0.8 < score <= 1.0


def test_dead_proxy_gets_zero_score():
    checks = [make_check(passed=False)]
    score = compute_proxy_score(make_proxy(), checks, make_context())
    assert score == 0.0
```

```python
# tests/integration/test_acquire_flow.py

async def test_acquire_deducts_credit(client, db_session):
    user = await create_user_with_credits(db_session, credits=10)
    await create_active_proxy(db_session)

    response = await client.post(
        "/proxies/acquire",
        headers={"Authorization": f"Bearer {user.api_key}"},
        json={"protocol": "http"},
    )

    assert response.status_code == 200
    assert response.json()["credits_remaining"] == 9
```

## Code quality

### Linting and formatting

```bash
# Check
uv run ruff check src/ tests/
uv run ruff format --check src/ tests/

# Fix
uv run ruff check --fix src/ tests/
uv run ruff format src/ tests/
```

### Type checking

```bash
uv run mypy src/
```

`mypy` is configured with `strict = true` in `pyproject.toml`. The `pydantic.mypy` plugin is enabled for correct Pydantic model inference.
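
Among other things, strict mode rejects untyped function definitions (`disallow_untyped_defs`). A small example of the kind of signature it requires (the function itself is hypothetical):

```python
# Rejected under strict mode: no parameter or return annotations.
# def score(latency):
#     return 1.0 / latency

# Accepted: fully annotated, including the return type.
def score(latency_ms: float) -> float:
    """Toy scoring helper, used only to illustrate strict-mode annotations."""
    return 1.0 / max(latency_ms, 1.0)
```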

### Pre-commit (optional)

If you want automated checks on every commit:

```bash
uv tool install pre-commit
pre-commit install
```

## Docker workflow

### Build the image

```bash
docker compose build
```

### Run the full stack

```bash
# Run migrations + start API + worker
docker compose --profile migrate up -d migrate
docker compose up -d api worker
```

### View logs

```bash
docker compose logs -f api worker
```

### Rebuild after code changes

```bash
docker compose build api
docker compose up -d api worker
```

### Shell into a running container

```bash
docker compose exec api bash
docker compose exec postgres psql -U proxypool
docker compose exec redis redis-cli
```

## Adding a new plugin

1. Create a file in `src/proxy_pool/plugins/builtin/<type>/your_plugin.py`.
2. Implement the relevant Protocol (see `02-plugin-system.md`).
3. Define `create_plugin(settings: Settings) -> YourPlugin | None`.
4. Add tests in `tests/plugins/test_your_plugin.py`.
5. Restart the app — the plugin is auto-discovered.

For third-party plugins, place files in `plugins/contrib/` (or mount a directory at `/app/plugins-contrib` in Docker).

## Common development tasks

### Add a new API endpoint

1. Define Pydantic schemas in `<domain>/schemas.py`.
2. Add business logic in `<domain>/service.py`.
3. Create the route in `<domain>/router.py`.
4. Register the router in `app.py` if it's a new router.
5. Add tests.
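
The schema → service → router layering above can be sketched in miniature. Plain functions and dataclasses stand in for Pydantic models and FastAPI routing here, so every name below is hypothetical:

```python
from dataclasses import dataclass


# 1. Schema (<domain>/schemas.py): validated request/response shapes.
@dataclass
class ReleaseRequest:
    proxy_id: int


@dataclass
class ReleaseResponse:
    proxy_id: int
    released: bool


# 2. Service (<domain>/service.py): business logic, no HTTP concerns.
def release_proxy(req: ReleaseRequest) -> ReleaseResponse:
    # Real code would update the database; this sketch just echoes.
    return ReleaseResponse(proxy_id=req.proxy_id, released=True)


# 3. Router (<domain>/router.py): thin HTTP layer delegating to the service.
def post_release(payload: dict) -> dict:
    req = ReleaseRequest(proxy_id=int(payload["proxy_id"]))
    resp = release_proxy(req)
    return {"proxy_id": resp.proxy_id, "released": resp.released}
```

Keeping the service free of HTTP types is what makes it unit-testable without a client fixture.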

### Add a new database table

1. Define the SQLAlchemy model in `<domain>/models.py`.
2. Import the model in `db/base.py` (so Alembic sees it).
3. Generate a migration: `uv run alembic revision --autogenerate -m "description"`.
4. Review and apply: `uv run alembic upgrade head`.
5. Add tests.

### Add a new background task

1. Define the task function in `worker/tasks_<category>.py`.
2. Register it in `worker/settings.py` (add to `functions` list, and `cron_jobs` if periodic).
3. Restart the worker.
4. Add tests.
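
An ARQ task is just an async function whose first argument is the worker context, and registration is a matter of listing it in `WorkerSettings`. A minimal sketch (the task name and context keys are assumptions; periodic jobs would additionally go in `cron_jobs` via ARQ's cron helper):

```python
import asyncio


async def recheck_proxies(ctx: dict) -> int:
    """Hypothetical task: recheck proxies and return how many were checked.

    In the real worker, ctx carries shared resources (db pool, redis, ...).
    """
    proxies = ctx.get("proxies", [])
    return len(proxies)


class WorkerSettings:
    """Sketch of worker/settings.py registration: ARQ discovers tasks
    through this functions list (and cron_jobs for periodic ones)."""

    functions = [recheck_proxies]


checked = asyncio.run(recheck_proxies({"proxies": ["10.0.0.1:8080"]}))
assert checked == 1
```

Because the task is a plain async function, tests can call it directly with a hand-built `ctx` dict, with no worker running.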