Feat: Enforce PostgreSQL for integration tests; add Docker test stack

- conftest.py: pytest_configure guard rejects non-postgresql+asyncpg:// URLs
  before any test collects (per constitution §2.5/§5.2 v1.3.0)
- docker-compose.test.yml: isolated postgres-test (5433) + minio-test (9002)
  + api-test runner; one command runs the full suite against real PostgreSQL
- Makefile: test-unit and test-integration targets
- .env.test.example: documents variables needed to run tests outside Docker
- Fix pre-existing test bug: integration tests that used the client fixture
  (NoOpAuthProvider) for write operations (upload/delete/patch) now use
  authed_client with a Bearer token; these gaps went unnoticed because the
  tests never ran against a live stack

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Date: 2026-05-06 19:14:12 +00:00
parent 354c85292d
commit f3e0021ee8
17 changed files with 761 additions and 39 deletions

.env.test.example (new file)

@@ -0,0 +1,29 @@
# Integration test environment variables
# Used when running pytest directly on the host (outside Docker).
#
# Start test services first:
# docker compose -f docker-compose.test.yml up -d postgres-test minio-test minio-init-test
#
# Then source this file and run tests:
# export $(grep -v '^#' .env.test.example | xargs)
# cd api && python -m pytest tests/integration/ -v

# PostgreSQL test database (postgres-test container on host port 5433)
TEST_DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test
DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test
# MinIO test instance (minio-test container on host port 9002)
S3_ENDPOINT_URL=http://localhost:9002
S3_BUCKET_NAME=reactbin-test
S3_ACCESS_KEY_ID=minioadmin
S3_SECRET_ACCESS_KEY=minioadmin
S3_REGION=us-east-1
# Auth (test values — not for production)
JWT_SECRET_KEY=test-secret-key-for-testing-only
OWNER_USERNAME=testowner
OWNER_PASSWORD=testpassword
# API
API_BASE_URL=http://localhost:8000
MAX_UPLOAD_BYTES=52428800
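
The `export $(grep -v '^#' … | xargs)` one-liner in the comments above treats every non-comment line as a KEY=VALUE pair. A minimal Python sketch of that selection logic (hypothetical helper, not part of the repo; like the shell pattern, it assumes values contain no spaces):

```python
# Mimics `export $(grep -v '^#' file | xargs)`: keep non-empty,
# non-comment lines and split each on the first '='.
# Caveat: like the shell one-liner, this breaks if a value contains spaces;
# all values in .env.test.example are space-free.
def env_pairs(text: str) -> dict[str, str]:
    pairs = {}
    for line in text.splitlines():
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")
            pairs[key] = value
    return pairs

sample = "# MinIO test instance\nS3_REGION=us-east-1\nMAX_UPLOAD_BYTES=52428800\n"
print(env_pairs(sample))
```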

.gitignore (vendored)

@@ -5,6 +5,7 @@ notes/
 .env
 .env.*
 !.env.example
+!.env.test.example

 # Python
 __pycache__/

@@ -1 +1 @@
-{"feature_directory":"specs/007-tag-browser"}
+{"feature_directory":"specs/008-postgres-integration-tests"}

@@ -1,5 +1,5 @@
 <!-- SPECKIT START -->
 For additional context about technologies to be used, project structure,
 shell commands, and other important information, read the current plan at
-`specs/007-tag-browser/plan.md`.
+`specs/008-postgres-integration-tests/plan.md`.
 <!-- SPECKIT END -->

Makefile (new file)

@@ -0,0 +1,7 @@
.PHONY: test-unit test-integration

test-unit:
	cd api && python -m pytest tests/unit/ -v

test-integration:
	docker compose -f docker-compose.test.yml run --rm api-test

api/tests/integration/conftest.py

@@ -1,5 +1,6 @@
 import os
+import pytest
 import pytest_asyncio
 from httpx import ASGITransport, AsyncClient
 from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
@@ -9,11 +10,11 @@ os.environ.setdefault("JWT_SECRET_KEY", "test-secret-key-for-testing-only")
 os.environ.setdefault("OWNER_USERNAME", "testowner")
 os.environ.setdefault("OWNER_PASSWORD", "testpassword")

-from app.auth.jwt_provider import JWTAuthProvider
-from app.config import get_settings
-from app.database import Base
-from app.dependencies import get_auth, get_db, get_storage
-from app.main import app
+from app.auth.jwt_provider import JWTAuthProvider  # noqa: E402
+from app.config import get_settings  # noqa: E402
+from app.database import Base  # noqa: E402
+from app.dependencies import get_auth, get_db, get_storage  # noqa: E402
+from app.main import app  # noqa: E402

 # Bust the LRU cache so get_settings() picks up the env vars set above
 get_settings.cache_clear()
@@ -26,8 +27,6 @@ _TEST_OWNER_PASSWORD = os.environ["OWNER_PASSWORD"]
 @pytest_asyncio.fixture(scope="session", loop_scope="session")
 async def engine():
     settings = get_settings()
-    # Use a separate test database URL if TEST_DATABASE_URL is set
-    import os
     db_url = os.getenv("TEST_DATABASE_URL", settings.database_url)
     eng = create_async_engine(db_url, echo=False)
     async with eng.begin() as conn:
@@ -108,3 +107,15 @@ async def authed_client(db_session, jwt_auth_provider):
         yield c, valid_token
     app.dependency_overrides.clear()

+
+def pytest_configure(config):
+    db_url = os.getenv("TEST_DATABASE_URL") or os.getenv("DATABASE_URL", "")
+    if not db_url.startswith("postgresql+asyncpg://"):
+        pytest.exit(
+            "Integration tests require a PostgreSQL database "
+            "(postgresql+asyncpg://...). "
+            "Set TEST_DATABASE_URL or DATABASE_URL accordingly. "
+            f"Got: {db_url!r}",
+            returncode=1,
+        )

@@ -19,15 +19,18 @@ def _minimal_jpeg_v2() -> bytes:
 @pytest.mark.asyncio
-async def test_delete_removes_record(client):
+async def test_delete_removes_record(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_jpeg_v2()
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("del-test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     image_id = upload.json()["id"]

-    delete_resp = await client.delete(f"/api/v1/images/{image_id}")
+    delete_resp = await client.delete(f"/api/v1/images/{image_id}", headers=headers)
     assert delete_resp.status_code == 204

     get_resp = await client.get(f"/api/v1/images/{image_id}")
@@ -36,16 +39,19 @@ async def test_delete_removes_record(client):
 @pytest.mark.asyncio
-async def test_delete_removes_storage_object(client):
+async def test_delete_removes_storage_object(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_jpeg_v2() + b"\x00"
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("del-storage-test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert upload.status_code in (200, 201)
     image_id = upload.json()["id"]

-    delete_resp = await client.delete(f"/api/v1/images/{image_id}")
+    delete_resp = await client.delete(f"/api/v1/images/{image_id}", headers=headers)
     assert delete_resp.status_code == 204

     # Confirm storage redirect no longer works (404 since record is gone)
@@ -54,15 +60,21 @@ async def test_delete_removes_storage_object(client):
 @pytest.mark.asyncio
-async def test_delete_unknown_id_returns_404(client):
-    response = await client.delete(f"/api/v1/images/{uuid.uuid4()}")
+async def test_delete_unknown_id_returns_404(authed_client):
+    client, token = authed_client
+    response = await client.delete(
+        f"/api/v1/images/{uuid.uuid4()}",
+        headers={"Authorization": f"Bearer {token}"},
+    )
     assert response.status_code == 404
     body = response.json()
     assert body["code"] == "image_not_found"

 @pytest.mark.asyncio
-async def test_delete_removes_thumbnail(client):
+async def test_delete_removes_thumbnail(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     buf = io.BytesIO()
     PILImage.new("RGB", (200, 150), color=(60, 90, 120)).save(buf, format="JPEG")
     data = buf.getvalue()
@@ -70,12 +82,13 @@ async def test_delete_removes_thumbnail(client):
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("thumb-del.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert upload.status_code == 201
     image_id = upload.json()["id"]
     assert upload.json()["thumbnail_key"] is not None

-    delete_resp = await client.delete(f"/api/v1/images/{image_id}")
+    delete_resp = await client.delete(f"/api/v1/images/{image_id}", headers=headers)
     assert delete_resp.status_code == 204

     thumb_resp = await client.get(f"/api/v1/images/{image_id}/thumbnail")

@@ -16,7 +16,9 @@ def _minimal_gif() -> bytes:
 @pytest.mark.asyncio
-async def test_and_filter_returns_only_matching_images(client):
+async def test_and_filter_returns_only_matching_images(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_gif()

     # Image with both tags
@@ -24,6 +26,7 @@ async def test_and_filter_returns_only_matching_images(client):
         "/api/v1/images",
         files={"file": ("both.gif", io.BytesIO(data), "image/gif")},
         data={"tags": "andcat,andfunny"},
+        headers=headers,
     )
     both_id = r_both.json()["id"]
@@ -32,6 +35,7 @@ async def test_and_filter_returns_only_matching_images(client):
         "/api/v1/images",
         files={"file": ("one.gif", io.BytesIO(data + b"\x00"), "image/gif")},
         data={"tags": "andcat"},
+        headers=headers,
     )

     response = await client.get("/api/v1/images?tags=andcat,andfunny")
@@ -43,7 +47,9 @@ async def test_and_filter_returns_only_matching_images(client):
 @pytest.mark.asyncio
-async def test_filter_excludes_partial_tag_match(client):
+async def test_filter_excludes_partial_tag_match(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_gif()

     # Image with only "exclcat"
@@ -51,6 +57,7 @@ async def test_filter_excludes_partial_tag_match(client):
         "/api/v1/images",
         files={"file": ("partial.gif", io.BytesIO(data + b"\x01"), "image/gif")},
         data={"tags": "exclcat"},
+        headers=headers,
     )

     # Filter requires both exclcat and exclother

@@ -29,11 +29,13 @@ def _minimal_webp() -> bytes:
 @pytest.mark.asyncio
-async def test_file_returns_200_with_content(client):
+async def test_file_returns_200_with_content(authed_client):
+    client, token = authed_client
     data = _minimal_webp()
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("img.webp", io.BytesIO(data), "image/webp")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert upload.status_code in (200, 201)
     upload_body = upload.json()
@@ -57,11 +59,13 @@ async def test_file_unknown_id_returns_404(client):
 @pytest.mark.asyncio
-async def test_file_response_exposes_no_storage_details(client):
+async def test_file_response_exposes_no_storage_details(authed_client):
+    client, token = authed_client
     data = _minimal_webp()
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("img.webp", io.BytesIO(data), "image/webp")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert upload.status_code in (200, 201)
     image_id = upload.json()["id"]
@@ -75,11 +79,13 @@ async def test_file_response_exposes_no_storage_details(client):
 @pytest.mark.asyncio
-async def test_thumbnail_returns_webp(client):
+async def test_thumbnail_returns_webp(authed_client):
+    client, token = authed_client
     data = _real_jpeg()
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("t.jpg", io.BytesIO(data), "image/jpeg")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert upload.status_code == 201
     body = upload.json()
@@ -95,11 +101,13 @@ async def test_thumbnail_returns_webp(client):
 @pytest.mark.asyncio
-async def test_thumbnail_fallback_returns_original(client, db_session):
+async def test_thumbnail_fallback_returns_original(authed_client, db_session):
+    client, token = authed_client
     data = _real_jpeg()
     upload = await client.post(
         "/api/v1/images",
         files={"file": ("fallback.jpg", io.BytesIO(data), "image/jpeg")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert upload.status_code == 201
     image_id = upload.json()["id"]

@@ -31,12 +31,14 @@ def _minimal_png() -> bytes:
 @pytest.mark.asyncio
-async def test_upload_with_tags_persists_tags(client):
+async def test_upload_with_tags_persists_tags(authed_client):
+    client, token = authed_client
     data = _minimal_png()
     response = await client.post(
         "/api/v1/images",
         files={"file": ("img.png", io.BytesIO(data), "image/png")},
         data={"tags": "cat,funny"},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert response.status_code == 201
     body = response.json()
@@ -44,12 +46,15 @@ async def test_upload_with_tags_persists_tags(client):
 @pytest.mark.asyncio
-async def test_duplicate_upload_tags_unchanged(client):
+async def test_duplicate_upload_tags_unchanged(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_png()
     r1 = await client.post(
         "/api/v1/images",
         files={"file": ("img.png", io.BytesIO(data), "image/png")},
         data={"tags": "original-tag"},
+        headers=headers,
     )
     assert r1.status_code in (200, 201)
     original_tags = set(r1.json()["tags"])
@@ -58,6 +63,7 @@ async def test_duplicate_upload_tags_unchanged(client):
         "/api/v1/images",
         files={"file": ("img.png", io.BytesIO(data), "image/png")},
         data={"tags": "different-tag"},
+        headers=headers,
     )
     assert r2.status_code == 200
     assert r2.json()["duplicate"] is True
@@ -65,18 +71,22 @@ async def test_duplicate_upload_tags_unchanged(client):
 @pytest.mark.asyncio
-async def test_patch_replaces_tag_set(client):
+async def test_patch_replaces_tag_set(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_png()
     r1 = await client.post(
         "/api/v1/images",
         files={"file": ("patch-test.png", io.BytesIO(data), "image/png")},
         data={"tags": "old-tag"},
+        headers=headers,
     )
     image_id = r1.json()["id"]

     patch = await client.patch(
         f"/api/v1/images/{image_id}/tags",
         json={"tags": ["new-tag", "another"]},
+        headers=headers,
     )
     assert patch.status_code == 200
     body = patch.json()
@@ -85,17 +95,21 @@ async def test_patch_replaces_tag_set(client):
 @pytest.mark.asyncio
-async def test_patch_invalid_tag_returns_422(client):
+async def test_patch_invalid_tag_returns_422(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _minimal_png()
     r1 = await client.post(
         "/api/v1/images",
         files={"file": ("invalid-tag-test.png", io.BytesIO(data), "image/png")},
+        headers=headers,
     )
     image_id = r1.json()["id"]

     patch = await client.patch(
         f"/api/v1/images/{image_id}/tags",
         json={"tags": ["valid", "INVALID TAG WITH SPACES!"]},
+        headers=headers,
     )
     assert patch.status_code == 422
     body = patch.json()
@@ -103,12 +117,14 @@ async def test_patch_invalid_tag_returns_422(client):
 @pytest.mark.asyncio
-async def test_list_tags_alphabetical_with_counts(client):
+async def test_list_tags_alphabetical_with_counts(authed_client):
+    client, token = authed_client
     data = _minimal_png()
     await client.post(
         "/api/v1/images",
         files={"file": ("tag-list-test.png", io.BytesIO(data), "image/png")},
         data={"tags": "zebra,apple"},
+        headers={"Authorization": f"Bearer {token}"},
     )
     response = await client.get("/api/v1/tags")
     assert response.status_code == 200
@@ -121,12 +137,14 @@ async def test_list_tags_alphabetical_with_counts(client):
 @pytest.mark.asyncio
-async def test_list_tags_prefix_filter(client):
+async def test_list_tags_prefix_filter(authed_client):
+    client, token = authed_client
     data = _minimal_png()
     await client.post(
         "/api/v1/images",
         files={"file": ("prefix-test.png", io.BytesIO(data), "image/png")},
         data={"tags": "cat,catfish,caterpillar,dog"},
+        headers={"Authorization": f"Bearer {token}"},
     )
     response = await client.get("/api/v1/tags?q=cat")
     assert response.status_code == 200
@@ -155,13 +173,16 @@ def _unique_png(seed: int) -> bytes:
 @pytest.mark.asyncio
-async def test_list_tags_sort_count_desc(client):
+async def test_list_tags_sort_count_desc(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     # popular-sort-tag appears on 2 images, rare-sort-tag on 1 — verify count_desc ordering
     for seed in (100, 101):
         await client.post(
             "/api/v1/images",
             files={"file": (f"sort-{seed}.png", io.BytesIO(_unique_png(seed)), "image/png")},
             data={"tags": "popular-sort-tag,rare-sort-tag" if seed == 100 else "popular-sort-tag"},
+            headers=headers,
         )
     response = await client.get("/api/v1/tags?sort=count_desc")
     assert response.status_code == 200
@@ -177,13 +198,16 @@ async def test_list_tags_sort_count_desc(client):
 @pytest.mark.asyncio
-async def test_list_tags_min_count_excludes_below_threshold(client):
+async def test_list_tags_min_count_excludes_below_threshold(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     # common-min-tag appears on 2 images, uncommon-min-tag on 1
     for seed in (200, 201):
         await client.post(
             "/api/v1/images",
             files={"file": (f"min-{seed}.png", io.BytesIO(_unique_png(seed)), "image/png")},
             data={"tags": "common-min-tag,uncommon-min-tag" if seed == 200 else "common-min-tag"},
+            headers=headers,
         )
     # min_count=2 should exclude uncommon-min-tag (count=1) but keep common-min-tag (count=2)
     response = await client.get("/api/v1/tags?min_count=2")

@@ -6,6 +6,7 @@ T029 — file > MAX_UPLOAD_BYTES → 422 file_too_large
 T079 — GET /api/v1/images/{id} 404 → error envelope shape
 """
 import io
+import uuid
 from unittest.mock import patch

 import pytest
@@ -27,11 +28,13 @@ def _minimal_jpeg() -> bytes:
 @pytest.mark.asyncio
-async def test_upload_new_image_returns_201(client):
+async def test_upload_new_image_returns_201(authed_client):
+    client, token = authed_client
     data = _minimal_jpeg()
     response = await client.post(
         "/api/v1/images",
         files={"file": ("test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert response.status_code == 201
     body = response.json()
@@ -44,12 +47,15 @@ async def test_upload_new_image_returns_201(client):
 @pytest.mark.asyncio
-async def test_upload_duplicate_returns_200_with_flag(client):
+async def test_upload_duplicate_returns_200_with_flag(authed_client):
+    client, token = authed_client
     data = _minimal_jpeg()
+    headers = {"Authorization": f"Bearer {token}"}
     # First upload
     r1 = await client.post(
         "/api/v1/images",
         files={"file": ("test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert r1.status_code in (200, 201)
@@ -57,6 +63,7 @@ async def test_upload_duplicate_returns_200_with_flag(client):
     r2 = await client.post(
         "/api/v1/images",
         files={"file": ("test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert r2.status_code == 200
     body = r2.json()
@@ -65,10 +72,12 @@ async def test_upload_duplicate_returns_200_with_flag(client):
 @pytest.mark.asyncio
-async def test_upload_invalid_mime_type_returns_422(client):
+async def test_upload_invalid_mime_type_returns_422(authed_client):
+    client, token = authed_client
     response = await client.post(
         "/api/v1/images",
         files={"file": ("doc.pdf", io.BytesIO(b"%PDF-1.4"), "application/pdf")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert response.status_code == 422
     body = response.json()
@@ -77,11 +86,12 @@ async def test_upload_invalid_mime_type_returns_422(client):
 @pytest.mark.asyncio
-async def test_upload_oversized_file_returns_422(client):
+async def test_upload_oversized_file_returns_422(authed_client):
     import os
     from app.config import get_settings

+    client, token = authed_client
     os.environ["MAX_UPLOAD_BYTES"] = "10"
     get_settings.cache_clear()
@@ -89,6 +99,7 @@ async def test_upload_oversized_file_returns_422(client):
     response = await client.post(
         "/api/v1/images",
         files={"file": ("big.jpg", io.BytesIO(b"x" * 11), "image/jpeg")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert response.status_code == 422
     body = response.json()
@@ -100,7 +111,6 @@ async def test_upload_oversized_file_returns_422(client):
 @pytest.mark.asyncio
 async def test_get_unknown_image_returns_404_with_envelope(client):
-    import uuid
     response = await client.get(f"/api/v1/images/{uuid.uuid4()}")
     assert response.status_code == 404
     body = response.json()
@@ -109,11 +119,13 @@ async def test_get_unknown_image_returns_404_with_envelope(client):
 @pytest.mark.asyncio
-async def test_upload_returns_thumbnail_key(client):
+async def test_upload_returns_thumbnail_key(authed_client):
+    client, token = authed_client
     data = _real_jpeg(color=(100, 150, 200))
     response = await client.post(
         "/api/v1/images",
         files={"file": ("thumb_test.jpg", io.BytesIO(data), "image/jpeg")},
+        headers={"Authorization": f"Bearer {token}"},
     )
     assert response.status_code == 201
     body = response.json()
@@ -123,17 +135,21 @@ async def test_upload_returns_thumbnail_key(client):
 @pytest.mark.asyncio
-async def test_duplicate_upload_reuses_thumbnail_key(client):
+async def test_duplicate_upload_reuses_thumbnail_key(authed_client):
+    client, token = authed_client
+    headers = {"Authorization": f"Bearer {token}"}
     data = _real_jpeg(color=(200, 100, 50))
     r1 = await client.post(
         "/api/v1/images",
         files={"file": ("dup.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert r1.status_code in (200, 201)

     r2 = await client.post(
         "/api/v1/images",
         files={"file": ("dup.jpg", io.BytesIO(data), "image/jpeg")},
+        headers=headers,
     )
     assert r2.status_code == 200
@@ -144,12 +160,14 @@ async def test_duplicate_upload_reuses_thumbnail_key(client):
 @pytest.mark.asyncio
-async def test_upload_succeeds_when_thumbnail_fails(client):
+async def test_upload_succeeds_when_thumbnail_fails(authed_client):
+    client, token = authed_client
     data = _real_jpeg(color=(50, 200, 150))
     with patch("app.routers.images.generate_thumbnail", side_effect=RuntimeError("simulated")):
         response = await client.post(
             "/api/v1/images",
             files={"file": ("no_thumb.jpg", io.BytesIO(data), "image/jpeg")},
+            headers={"Authorization": f"Bearer {token}"},
         )
     assert response.status_code in (200, 201)
     body = response.json()

docker-compose.test.yml (new file)

@@ -0,0 +1,67 @@
services:
  postgres-test:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: reactbin
      POSTGRES_PASSWORD: reactbin
      POSTGRES_DB: reactbin_test
    ports:
      - "5433:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U reactbin"]
      interval: 5s
      timeout: 5s
      retries: 5

  minio-test:
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    ports:
      - "9002:9000"
      - "9003:9001"
    healthcheck:
      test: ["CMD", "mc", "ready", "local"]
      interval: 5s
      timeout: 5s
      retries: 5

  minio-init-test:
    image: minio/mc:latest
    depends_on:
      minio-test:
        condition: service_healthy
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    entrypoint: >
      /bin/sh -c "
      mc alias set local http://minio-test:9000 $$MINIO_ROOT_USER $$MINIO_ROOT_PASSWORD &&
      mc mb --ignore-existing local/reactbin-test
      "

  api-test:
    build:
      context: ./api
    environment:
      TEST_DATABASE_URL: postgresql+asyncpg://reactbin:reactbin@postgres-test:5432/reactbin_test
      DATABASE_URL: postgresql+asyncpg://reactbin:reactbin@postgres-test:5432/reactbin_test
      S3_ENDPOINT_URL: http://minio-test:9000
      S3_BUCKET_NAME: reactbin-test
      S3_ACCESS_KEY_ID: minioadmin
      S3_SECRET_ACCESS_KEY: minioadmin
      S3_REGION: us-east-1
      JWT_SECRET_KEY: test-secret-key-for-testing-only
      OWNER_USERNAME: testowner
      OWNER_PASSWORD: testpassword
      API_BASE_URL: http://localhost:8000
      MAX_UPLOAD_BYTES: "52428800"
    depends_on:
      postgres-test:
        condition: service_healthy
      minio-init-test:
        condition: service_completed_successfully
    command: ["python", "-m", "pytest", "tests/", "-v"]
    working_dir: /app


@@ -0,0 +1,236 @@
# Implementation Plan: PostgreSQL Integration Test Infrastructure
**Branch**: `master` | **Date**: 2026-05-06 | **Spec**: specs/008-postgres-integration-tests/spec.md
**Input**: Feature specification from `specs/008-postgres-integration-tests/spec.md`
---
## Summary
Enforce the constitution's PostgreSQL mandate (§2.5, §5.2 v1.3.0) for integration tests. Three concrete deliverables: (1) a fast-fail guard in `conftest.py` that rejects non-PostgreSQL URLs before any test collects, (2) a `docker-compose.test.yml` that provides isolated `postgres-test` and `minio-test` services and an `api-test` runner, and (3) a `Makefile` + `.env.test.example` that document the canonical test commands.
---
## Technical Context
**Language/Version**: Python 3.12, Docker Compose v2
**Primary Dependencies**: pytest, pytest-asyncio, asyncpg, SQLAlchemy 2.x (all already in `pyproject.toml [dev]`)
**Storage**: PostgreSQL 16-alpine (test instance), MinIO (test instance)
**Testing**: pytest — this feature *is* the test infrastructure change
**Target Platform**: Developer workstation (Linux/macOS) with Docker
**Project Type**: Infrastructure / developer-experience
**Performance Goals**: Guard exits in < 2 s; full integration suite continues to run in < 60 s
**Constraints**: Must not break the existing dev compose stack; no changes to application source code
**Scale/Scope**: One guard, one compose file, one Makefile, one env example
---
## Constitution Check
| Principle | Status | Notes |
|-----------|--------|-------|
| §2.5 Database abstraction — no alternative DB in integration tests | ✅ ENFORCED | This feature implements the enforcement |
| §5.1 TDD — failing test before implementation | ✅ | Guard itself is tested by running with a bad URL before adding the guard |
| §5.2 Test pyramid — integration tests use real PostgreSQL | ✅ ENFORCED | docker-compose.test.yml provides the real instance |
| §5.4 CI must pass before task is done | ✅ | Verified by running the full suite via compose |
| §6 Tech stack — asyncpg driver, Docker Compose | ✅ | No new technologies introduced |
| §7.1 One-command local start | ✅ | `docker compose -f docker-compose.test.yml run --rm api-test` |
| §7.2 Environment config via env vars | ✅ | .env.test.example documents all vars |
| §7.3 Linting not optional | ✅ | ruff will run as part of task validation |
No violations.
---
## Project Structure
### Documentation (this feature)
```text
specs/008-postgres-integration-tests/
├── plan.md ← this file
├── research.md ← decisions made above
├── spec.md ← feature specification
└── tasks.md ← generated by /speckit-tasks
```
### Source changes
```text
# New files
docker-compose.test.yml ← isolated test services + api-test runner
.env.test.example ← documents test environment variables
Makefile ← test-unit / test-integration targets
# Modified files
api/tests/integration/conftest.py ← add postgresql+asyncpg:// dialect guard
```
No application source files (`api/app/`) are modified. No UI files are touched.
---
## Detailed Design
### 1. conftest.py — dialect guard
Add a module-level `pytest_configure` hook at the top of `api/tests/integration/conftest.py`. It resolves the database URL (same logic as the `engine` fixture: prefer `TEST_DATABASE_URL`, fall back to `settings.database_url`) and calls `pytest.exit()` if the scheme is not `postgresql+asyncpg`:
```python
import os

import pytest


def pytest_configure(config):
    db_url = os.getenv("TEST_DATABASE_URL") or os.getenv("DATABASE_URL", "")
    if not db_url.startswith("postgresql+asyncpg://"):
        pytest.exit(
            "Integration tests require a PostgreSQL database "
            "(postgresql+asyncpg://...). "
            "Set TEST_DATABASE_URL or DATABASE_URL accordingly. "
            f"Got: {db_url!r}",
            returncode=1,
        )
```
The hook runs before any fixture or collection, giving an immediate, unambiguous error.
**Note**: This guard goes in `api/tests/integration/conftest.py` only, not in `api/tests/conftest.py`, so that unit tests (which use no database) are unaffected.
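The check itself is a pure string predicate, so it can be exercised without a database. A hypothetical helper extracted for illustration (the actual guard inlines this logic):

```python
def is_supported_test_url(db_url: str) -> bool:
    """True only for the asyncpg PostgreSQL dialect the guard accepts."""
    return db_url.startswith("postgresql+asyncpg://")


# Accepted: the canonical test URL
assert is_supported_test_url(
    "postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test"
)
# Rejected: SQLite, the dialect that let the HAVING bug slip through
assert not is_supported_test_url("sqlite+aiosqlite:///test.db")
# Rejected: an unset URL resolves to "" and fails the same check
assert not is_supported_test_url("")
```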
### 2. docker-compose.test.yml
```yaml
services:
postgres-test:
image: postgres:16-alpine
environment:
POSTGRES_USER: reactbin
POSTGRES_PASSWORD: reactbin
POSTGRES_DB: reactbin_test
ports:
- "5433:5432"
healthcheck:
test: ["CMD-SHELL", "pg_isready -U reactbin"]
interval: 5s
timeout: 5s
retries: 5
minio-test:
image: minio/minio:latest
command: server /data --console-address ":9001"
environment:
MINIO_ROOT_USER: minioadmin
MINIO_ROOT_PASSWORD: minioadmin
ports:
- "9002:9000"
- "9003:9001"
healthcheck:
test: ["CMD", "mc", "ready", "local"]
interval: 5s
timeout: 5s
retries: 5
minio-init-test:
image: minio/mc:latest
depends_on:
minio-test:
condition: service_healthy
environment:
MINIO_ROOT_USER: minioadmin
MINIO_ROOT_PASSWORD: minioadmin
entrypoint: >
/bin/sh -c "
mc alias set local http://minio-test:9000 $$MINIO_ROOT_USER $$MINIO_ROOT_PASSWORD &&
mc mb --ignore-existing local/reactbin-test
"
api-test:
build:
context: ./api
environment:
TEST_DATABASE_URL: postgresql+asyncpg://reactbin:reactbin@postgres-test:5432/reactbin_test
DATABASE_URL: postgresql+asyncpg://reactbin:reactbin@postgres-test:5432/reactbin_test
S3_ENDPOINT_URL: http://minio-test:9000
S3_BUCKET_NAME: reactbin-test
S3_ACCESS_KEY_ID: minioadmin
S3_SECRET_ACCESS_KEY: minioadmin
S3_REGION: us-east-1
JWT_SECRET_KEY: test-secret-key-for-testing-only
OWNER_USERNAME: testowner
OWNER_PASSWORD: testpassword
API_BASE_URL: http://localhost:8000
MAX_UPLOAD_BYTES: "52428800"
depends_on:
postgres-test:
condition: service_healthy
minio-init-test:
condition: service_completed_successfully
command: ["python", "-m", "pytest", "tests/", "-v"]
working_dir: /app
```
### 3. .env.test.example
Documents the variables needed to run integration tests from the host (with postgres-test and minio-test already running via compose):
```bash
# Integration test environment — used when running pytest directly on the host
# Start test services first: docker compose -f docker-compose.test.yml up -d postgres-test minio-test minio-init-test
TEST_DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test
DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test
S3_ENDPOINT_URL=http://localhost:9002
S3_BUCKET_NAME=reactbin-test
S3_ACCESS_KEY_ID=minioadmin
S3_SECRET_ACCESS_KEY=minioadmin
S3_REGION=us-east-1
JWT_SECRET_KEY=test-secret-key-for-testing-only
OWNER_USERNAME=testowner
OWNER_PASSWORD=testpassword
API_BASE_URL=http://localhost:8000
MAX_UPLOAD_BYTES=52428800
```
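Sourcing this file from a shell (e.g. `export $(grep -v '^#' .env.test | xargs)`) relies on it containing only simple `KEY=VALUE` lines and `#` comments. A small Python equivalent — a hypothetical helper, not one of the deliverables — makes that parsing rule explicit:

```python
def parse_env_file(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, skipping blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key] = value
    return env


sample = (
    "# PostgreSQL test database\n"
    "TEST_DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test\n"
)
assert parse_env_file(sample) == {
    "TEST_DATABASE_URL": "postgresql+asyncpg://reactbin:reactbin@localhost:5433/reactbin_test"
}
```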
### 4. Makefile
```makefile
.PHONY: test-unit test-integration

test-unit:
	cd api && python -m pytest tests/unit/ -v

test-integration:
	docker compose -f docker-compose.test.yml run --rm api-test
```
---
## Phase Breakdown
### Phase 1: Guard (FR-001) — US1
- Write a failing test: run `pytest api/tests/integration/` with `TEST_DATABASE_URL=sqlite+aiosqlite:///test.db` — confirm it does NOT exit early (test that the guard is absent)
- Add `pytest_configure` guard to `api/tests/integration/conftest.py`
- Verify: running with SQLite URL now exits immediately with the correct message
- Verify: running with a PostgreSQL URL proceeds normally
### Phase 2: Docker Compose test stack (FR-002, FR-003) — US2
- Write `docker-compose.test.yml` with `postgres-test`, `minio-test`, `minio-init-test`, `api-test`
- Run `docker compose -f docker-compose.test.yml run --rm api-test` — all tests pass
- Confirm dev stack (port 5432, 9000) is unaffected
### Phase 3: Documentation (FR-004, FR-005) — US3
- Write `.env.test.example`
- Write `Makefile` with `test-unit` and `test-integration`
- Verify `make test-unit` runs unit tests without Docker
- Verify `make test-integration` invokes the compose command
### Phase 4: Polish
- `ruff check api/app/ api/tests/` — zero violations
- `ng lint` is unaffected (no UI changes)
---
## No data model or API contracts
This feature touches only developer tooling. No new API endpoints, database schema changes, or UI components.
# Quickstart: Integration Test Infrastructure
## Run the full integration test suite (Docker, recommended)
```bash
docker compose -f docker-compose.test.yml run --rm api-test
```
Test services start automatically. The command exits with pytest's return code.
## Run unit tests only (no Docker required)
```bash
make test-unit
# or directly:
cd api && python -m pytest tests/unit/ -v
```
## Run integration tests from the host (test services must be running)
```bash
# Start test services
docker compose -f docker-compose.test.yml up -d postgres-test minio-test minio-init-test
# Copy and source test env vars
cp .env.test.example .env.test
export $(grep -v '^#' .env.test | xargs)
# Run tests
cd api && python -m pytest tests/integration/ -v
```
## Validate the guard works
```bash
TEST_DATABASE_URL=sqlite+aiosqlite:///test.db python -m pytest api/tests/integration/
# Expected: exits immediately with "Integration tests require a PostgreSQL database (postgresql+asyncpg://...)"
```
# Research: PostgreSQL Integration Test Infrastructure
## Decision 1: How to enforce the PostgreSQL dialect in conftest.py
**Decision**: Add a `pytest_configure` hook (or a module-level guard in `conftest.py`) that calls `pytest.exit()` if the resolved database URL does not start with `postgresql+asyncpg://`.
**Rationale**: `pytest_configure` runs before collection, giving the clearest possible failure signal. A module-level assertion would also work but produces a less readable traceback. `pytest.exit()` with a human-readable message is the idiomatic approach.
**Alternatives considered**:
- A custom pytest plugin in a separate file — unnecessary complexity for a one-liner guard.
- Raising an exception in the `engine` fixture — runs too late (after collection); developers see confusing fixture errors instead of a clear message.
---
## Decision 2: Separate docker-compose.test.yml vs profiles in docker-compose.yml
**Decision**: Use a standalone `docker-compose.test.yml` at the repo root.
**Rationale**: Docker Compose profiles require the developer to remember `--profile test` on every command. A separate file is explicit and self-contained. The test file can define its own service names and ports without touching the dev compose file at all.
**Alternatives considered**:
- `docker-compose.yml` with a `test` profile — profile discovery is non-obvious; modifying the dev file risks breaking the dev stack.
- A `docker-compose.override.yml` — override files apply automatically to `docker compose up`, which is the opposite of what we want for tests.
---
## Decision 3: Port assignments for test services
**Decision**:
- `postgres-test`: host port 5433 (standard offset from dev 5432)
- `minio-test` API: host port 9002 (offset from dev 9000)
- `minio-test` console: host port 9003 (offset from dev 9001)
**Rationale**: Predictable offsets make it easy to remember. Developers running both stacks simultaneously won't hit port conflicts.
---
## Decision 4: S3 isolation strategy for tests
**Decision**: The `api-test` service sets `S3_BUCKET_NAME=reactbin-test` pointing to the dedicated `minio-test` instance. The `minio-init-test` sidecar creates that bucket before tests run.
**Rationale**: The existing conftest already manages database isolation via `create_all` / `drop_all`. MinIO requires bucket pre-creation (same as dev). A dedicated test bucket on a dedicated test MinIO instance gives full isolation. No changes to application storage code are needed.
---
## Decision 5: Makefile vs shell scripts
**Decision**: A `Makefile` at the repo root with `test-unit` and `test-integration` targets.
**Rationale**: `make` is universally available on Linux/macOS developer machines. The targets are short wrappers that document the canonical test invocation. No build logic; just convenience aliases.
**Alternatives considered**:
- Shell scripts (`scripts/test.sh`) — no discoverability; `make help` is more ergonomic.
- `package.json` scripts — wrong tool for a Python/Docker project.
- `justfile` — not universally installed.
# Feature Specification: PostgreSQL Integration Test Infrastructure
**Feature Branch**: `008-postgres-integration-tests`
**Created**: 2026-05-06
**Status**: Draft
---
## Overview
Integration tests currently permit any SQLAlchemy-compatible database URL, including SQLite. This allowed a real production bug (incorrect `HAVING` without `GROUP BY`) to ship undetected because SQLite's permissive dialect did not reject it. The project constitution (§2.5, §5.2 v1.3.0) now explicitly mandates PostgreSQL for integration tests. This feature enforces that mandate with infrastructure and guardrails.
---
## User Scenarios & Testing
### User Story 1 — Integration tests are enforced to run against PostgreSQL (Priority: P1)
A developer running `pytest` against a non-PostgreSQL database URL receives an immediate, descriptive failure before any test runs.
**Why this priority**: Directly addresses the production bug that prompted this feature. Without this, the constitution mandate has no teeth.
**Independent Test**: Set `TEST_DATABASE_URL=sqlite+aiosqlite:///test.db` and run `pytest api/tests/integration/`. Confirm pytest exits immediately with a message identifying the dialect problem and naming the required scheme.
**Acceptance Scenarios**:
1. **Given** `TEST_DATABASE_URL` is set to a SQLite URL, **When** `pytest api/tests/integration/` is invoked, **Then** pytest exits before collecting any test with an error beginning `Integration tests require a PostgreSQL database (postgresql+asyncpg://...)`.
2. **Given** `DATABASE_URL` is unset and `TEST_DATABASE_URL` is unset, **When** pytest is invoked, **Then** pytest exits with a clear message about the missing database URL.
3. **Given** `TEST_DATABASE_URL` is a valid `postgresql+asyncpg://` URL, **When** pytest is invoked, **Then** tests collect and run normally.
---
### User Story 2 — One-command integration test run against isolated services (Priority: P1)
A developer can run the entire integration test suite against dedicated, isolated PostgreSQL and MinIO instances with a single command.
**Why this priority**: Without this, the PostgreSQL requirement is mandated but impractical — developers have no easy way to satisfy it.
**Independent Test**: From the repo root with Docker available, run `docker compose -f docker-compose.test.yml run --rm api-test`. Confirm all integration tests pass, test containers start and stop cleanly, and dev database/bucket are untouched.
**Acceptance Scenarios**:
1. **Given** Docker is running and dev services are stopped, **When** the test command is run, **Then** isolated `postgres-test` and `minio-test` services start, all tests run against them, and the command exits with pytest's return code.
2. **Given** dev services are running on their normal ports, **When** the test command is run, **Then** test services use different ports (5433, 9002/9003) and do not interfere with the dev stack.
3. **Given** any test data is written during the run, **When** the test run completes, **Then** all test schema is dropped (conftest teardown is unchanged).
---
### User Story 3 — Test infrastructure is documented (Priority: P2)
A developer new to the project can understand how to run unit tests vs integration tests without reading the source code.
**Independent Test**: Read `.env.test.example` and `Makefile`. Confirm all required environment variables are documented and `make test-unit` / `make test-integration` targets are present.
**Acceptance Scenarios**:
1. **Given** a fresh clone, **When** the developer reads `.env.test.example`, **Then** they see every variable needed to run integration tests outside Docker, with example values.
2. **Given** the Makefile, **When** the developer runs `make test-unit`, **Then** the pytest unit suite runs without requiring Docker.
3. **Given** the Makefile, **When** the developer runs `make test-integration`, **Then** the Docker Compose test command runs.
---
### Edge Cases
- What if `TEST_DATABASE_URL` is set but malformed? — The guard should still catch a non-PostgreSQL scheme; asyncpg will raise its own error for a malformed URL.
- What if Docker is not available? — `make test-integration` fails at the Docker level with Docker's own error; the Makefile does not need to guard for this.
- What if the test PostgreSQL port (5433) is already in use? — Standard Docker port conflict error; no special handling needed.
---
## Requirements
### Functional Requirements
- **FR-001**: `conftest.py` MUST assert the resolved database URL starts with `postgresql+asyncpg://` and call `pytest.exit()` with a descriptive message before any test collects.
- **FR-002**: A `docker-compose.test.yml` MUST define isolated `postgres-test` (port 5433) and `minio-test` (ports 9002/9003) services and an `api-test` runner service.
- **FR-003**: The `api-test` service MUST set `TEST_DATABASE_URL` pointing to `postgres-test` and all S3 env vars pointing to `minio-test`.
- **FR-004**: A `.env.test.example` MUST document all environment variables required to run integration tests outside Docker.
- **FR-005**: A `Makefile` MUST provide `test-unit` and `test-integration` targets.
---
## Success Criteria
- **SC-001**: Running `pytest api/tests/integration/` with a SQLite URL exits in under 2 seconds with a clear error message — no tests run.
- **SC-002**: `docker compose -f docker-compose.test.yml run --rm api-test` completes successfully with all integration tests passing.
- **SC-003**: Dev services (postgres on 5432, minio on 9000) are unaffected when the test command runs.
---
## Assumptions
- Docker Compose v2 (`docker compose`) is available in the developer environment.
- The existing `conftest.py` `engine` fixture (session-scoped `create_all` / `drop_all`) continues to handle schema lifecycle; no per-test transaction rollback mechanism is introduced.
- CI/CD pipeline configuration is out of scope for this feature.
# Tasks: PostgreSQL Integration Test Infrastructure
**Input**: Design documents from `specs/008-postgres-integration-tests/`
**Prerequisites**: plan.md ✅, spec.md ✅, research.md ✅, quickstart.md ✅
**Tests**: TDD is non-negotiable (§5.1). For infrastructure tasks the "failing test" is a verification step that confirms the thing being built is absent before building it, then confirms it works after. Every user story has an explicit TDD red step before its implementation task.
**Organization**: No foundational blocking phase — all three user stories touch independent files and can proceed in order.
## Format: `[ID] [P?] [Story] Description`
- **[P]**: Can run in parallel with other [P] tasks in the same phase
- **[Story]**: Which user story this task belongs to
- Exact file paths included in every task description
---
## Phase 1: Setup
No new project structure required. The existing layout accommodates all changes.
---
## Phase 2: User Story 1 — Dialect guard in conftest (Priority: P1) 🎯 MVP
**Goal**: `pytest api/tests/integration/` exits immediately with a clear message if the database URL is not `postgresql+asyncpg://`.
**Independent Test**: Run `TEST_DATABASE_URL=sqlite+aiosqlite:///test.db python -m pytest api/tests/integration/ -q` — command exits in < 2 s with the error message `Integration tests require a PostgreSQL database (postgresql+asyncpg://...)` and no tests are collected.
- [X] T001 [US1] Confirm guard is absent (TDD red): from `api/`, run `TEST_DATABASE_URL=sqlite+aiosqlite:///test.db python -m pytest tests/integration/ -q --co 2>&1 | head -20` — observe that tests ARE collected and note the count (guard not yet in place)
- [X] T002 [US1] Add `pytest_configure` hook to `api/tests/integration/conftest.py` — resolve URL via `os.getenv("TEST_DATABASE_URL") or os.getenv("DATABASE_URL", "")`, call `pytest.exit("Integration tests require postgresql+asyncpg://...", returncode=1)` if URL does not start with `postgresql+asyncpg://`; place hook before any imports that depend on the database URL
- [X] T003 [US1] Verify guard works (TDD green): run `TEST_DATABASE_URL=sqlite+aiosqlite:///test.db python -m pytest api/tests/integration/ -q` — confirm immediate exit with the correct error message and zero tests collected; also confirm a valid `postgresql+asyncpg://` URL does not trigger the guard
**Checkpoint**: Dialect-mismatched test runs are blocked before any test collects.
---
## Phase 3: User Story 2 — Docker Compose test stack (Priority: P1)
**Goal**: `docker compose -f docker-compose.test.yml run --rm api-test` runs the full integration suite against isolated PostgreSQL and MinIO services on different ports than the dev stack.
**Independent Test**: Run `docker compose -f docker-compose.test.yml run --rm api-test` from the repo root — all tests pass; verify `docker compose ps` shows dev services (if running) are unaffected on their original ports.
- [X] T004 [US2] Confirm compose file is absent (TDD red): run `test -f docker-compose.test.yml && echo EXISTS || echo ABSENT` — confirm output is `ABSENT`
- [X] T005 [US2] Create `docker-compose.test.yml` at the repo root with four services: `postgres-test` (image `postgres:16-alpine`, host port 5433, db `reactbin_test`), `minio-test` (image `minio/minio:latest`, host ports 9002/9003), `minio-init-test` (creates bucket `reactbin-test`, depends on `minio-test` healthy), and `api-test` (builds from `./api`, runs `python -m pytest tests/ -v`, depends on `postgres-test` healthy and `minio-init-test` completed, environment sets `TEST_DATABASE_URL=postgresql+asyncpg://reactbin:reactbin@postgres-test:5432/reactbin_test`, `DATABASE_URL` to same value, and all S3 vars pointing to `minio-test:9000` with bucket `reactbin-test`) — follow exact design in `specs/008-postgres-integration-tests/plan.md`
- [X] T006 [US2] Verify compose stack (TDD green): run `docker compose -f docker-compose.test.yml run --rm api-test` — confirm all integration tests pass; confirm no errors about missing env vars or connection failures
**Checkpoint**: Full integration suite runs against real PostgreSQL via one command.
---
## Phase 4: User Story 3 — Test documentation (Priority: P2)
**Goal**: `.env.test.example` and `Makefile` document how to run both test tiers.
**Independent Test**: Read `.env.test.example` — all variables needed for integration tests are present with example values. Run `make test-unit` — pytest unit suite runs without Docker and passes.
- [X] T007 [P] [US3] Create `.env.test.example` at the repo root documenting all variables required to run integration tests outside Docker: `TEST_DATABASE_URL`, `DATABASE_URL`, `S3_ENDPOINT_URL`, `S3_BUCKET_NAME`, `S3_ACCESS_KEY_ID`, `S3_SECRET_ACCESS_KEY`, `S3_REGION`, `JWT_SECRET_KEY`, `OWNER_USERNAME`, `OWNER_PASSWORD`, `API_BASE_URL`, `MAX_UPLOAD_BYTES` — with example values pointing to `localhost:5433` and `localhost:9002` (test service ports); include a comment explaining how to start test services first — follow exact design in `specs/008-postgres-integration-tests/plan.md`
- [X] T008 [P] [US3] Create `Makefile` at the repo root with `.PHONY: test-unit test-integration`, `test-unit` target running `cd api && python -m pytest tests/unit/ -v`, and `test-integration` target running `docker compose -f docker-compose.test.yml run --rm api-test`
- [X] T009 [US3] Verify `make test-unit` — unit tests pass without Docker (validates the Makefile target and confirms unit tests have no Docker dependency)
- [X] T010 Verify `make test-integration` — Docker integration suite passes end-to-end (cross-story verification: exercises the US2 compose stack via the US3 Makefile target)
**Checkpoint**: All three user stories independently functional.
---
## Phase 5: Polish & Cross-Cutting Concerns
- [X] T011 Run `ruff check api/app/ api/tests/` — zero violations (conftest change must pass ruff; fix any issues)
---
## Dependencies & Execution Order
### Phase Dependencies
- **Phase 2 (US1)**: No external dependencies — can start immediately
- **Phase 3 (US2)**: Depends on Phase 2 (guard must be in place so the compose stack run exercises it)
- **Phase 4 (US3)**: T007 and T008 are independent file writes (can run in parallel with each other after Phase 3); T009 requires T008; T010 requires T008 and T006
- **Phase 5 (Polish)**: Depends on all prior phases
### Within Phase 4
- T007 ∥ T008 (different files, no dependency)
- T009 after T008 (Makefile must exist)
- T010 after T008 and T006 (requires both Makefile and compose stack)
### Execution Order Summary
```
Step 1: T001, T002, T003 (sequential — TDD for guard)
Step 2: T004, T005, T006 (sequential — TDD for compose stack)
Step 3 (parallel): T007, T008
Step 4: T009 (after T008), T010 (after T008 + T006)
Step 5: T011
```
---
## Implementation Strategy
### MVP (US1 — the guard)
1. Complete T001–T003
2. **Validate**: SQLite URL is blocked; PostgreSQL URL proceeds
3. US2 and US3 add the infrastructure to make the mandate practical
### Incremental Delivery
- After Phase 2: Dialect bugs are caught immediately — core safety net is in place
- After Phase 3: Full integration suite runs against PostgreSQL via one Docker command
- After Phase 4: Both test tiers are documented and accessible via `make`
- After Phase 5: Lint clean, ready for merge