- Replace vmsg WASM encoder with native MediaRecorder API (WebM/Opus)
to fix empty MP3 files causing OpenAI Whisper 400 errors
- Add minimum recording duration (2s) and file size (5KB) guards
- Add MinIO S3 storage integration for recipe images and audio
- Add /uploads/* API route that proxies files from MinIO with local fallback
- Save audio locally first for transcription, then upload to MinIO
(fixes ECONNREFUSED when backend tried to fetch its own public URL)
- Add docker-compose.prod.yml, nginx-prod.conf, frontend Dockerfile
- Frontend Dockerfile: no-cache headers on index.html, long cache on hashed assets
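The duration/size guards could be sketched like this (function name is ours; the thresholds are the 2s / 5KB from this commit):

```typescript
// Illustrative guard: reject recordings too short or too small to contain
// usable speech, using the thresholds from this commit (2s / 5KB).
const MIN_DURATION_MS = 2_000;
const MIN_SIZE_BYTES = 5 * 1024;

function isRecordingUsable(durationMs: number, sizeBytes: number): boolean {
  return durationMs >= MIN_DURATION_MS && sizeBytes >= MIN_SIZE_BYTES;
}
```

In the app this check would run on the MediaRecorder blob before upload, so near-empty files never reach Whisper.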
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Before: 'Gérer l'abonnement' opened the generic Customer Portal. If the
user cancelled, the portal's 'return_url' was just a button label —
nothing auto-redirected back to Freedge, so the user was stranded on
billing.stripe.com after clicking 'Cancel'.
Now: dedicated 'Annuler' button on the Profile SubscriptionCard that
calls a new backend endpoint POST /stripe/portal/cancel. This creates
a portal session with flow_data.type = 'subscription_cancel' deep-linked
to the user's active subscription, plus after_completion.type = 'redirect'
so Stripe automatically redirects to /subscription/cancelled when the
cancellation is confirmed.
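The session parameters the endpoint passes to stripe.billingPortal.sessions.create would look roughly like this sketch (IDs and the app URL are placeholders):

```typescript
// Sketch of the cancel deep-link portal session params. The flow_data shape
// follows Stripe's Billing Portal API; customerId/subscriptionId/appUrl are
// placeholders for the real values.
function buildCancelPortalParams(customerId: string, subscriptionId: string, appUrl: string) {
  return {
    customer: customerId,
    return_url: `${appUrl}/profile`,
    flow_data: {
      type: 'subscription_cancel' as const,
      subscription_cancel: { subscription: subscriptionId },
      after_completion: {
        type: 'redirect' as const,
        // Stripe redirects here once the cancellation is confirmed
        redirect: { return_url: `${appUrl}/subscription/cancelled` },
      },
    },
  };
}
```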
New page /subscription/cancelled:
- Animated heart badge (spring + pulsing halo)
- 'À bientôt, on l'espère' title
- Info box showing the period-end date (fetched via sync on mount)
so the user knows they still have access until the end of the
already-paid period
- Re-engagement message + 'Retour aux recettes' / 'Voir les plans' CTAs
- On mount: calls /stripe/sync so the DB is updated immediately
(doesn't wait for the customer.subscription.updated webhook)
Profile SubscriptionCard paid-state footer now has two buttons side by
side: 'Gérer' (outline) and 'Annuler' (ghost with red hover).
Backend verified: Stripe SDK v12 supports flow_data.after_completion.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Problem: if stripe listen is not running (dev) or the webhook secret is
misconfigured, a successful checkout leaves the user stuck on the free
plan in the DB even though Stripe knows they're subscribed.
Solution: 4 recovery mechanisms.
1. Backend: POST /stripe/sync (auth required)
Fetches the current user's subscriptions from Stripe by customer ID,
picks the most recent active/trialing/past_due one, and applies it to
the User row via the same applySubscriptionToUser helper used by the
webhook. If no active sub exists, downgrades to free. Returns the
current plan state.
2. Frontend: CheckoutSuccess now calls /stripe/sync first (instant,
reliable) before falling back to polling /stripe/subscription. This
fixes the 'just paid but still free' bug even with no webhook setup.
3. Frontend: 'Rafraîchir' button on the Profile free-plan upgrade banner
(ghost style with RefreshCw spinning icon). Tooltip hints at its
purpose. Users who paid but see the free state can click it to
self-heal in one click.
4. Backend script: scripts/sync-subscription.ts
- npm run stripe:sync -- user@example.com  (sync one user by email)
- npm run stripe:sync -- --all  (sync every user with a stripeId,
  useful after a prod webhook outage)
Colored output with ✓ / ✗ / ↷ status per user.
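The selection rule in /stripe/sync can be sketched as (types simplified from Stripe's):

```typescript
// Keep only recoverable statuses, most recent first; null means the user
// has no live subscription and should be downgraded to free.
type SubStatus = 'active' | 'trialing' | 'past_due' | 'canceled' | 'incomplete';
interface Sub { id: string; status: SubStatus; created: number }

function pickRecoverableSub(subs: Sub[]): Sub | null {
  const recoverable = subs
    .filter((s) => s.status === 'active' || s.status === 'trialing' || s.status === 'past_due')
    .sort((a, b) => b.created - a.created);
  return recoverable[0] ?? null;
}
```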
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
New script backend/scripts/setup-stripe.ts that:
- Reads STRIPE_SECRET_KEY from .env
- Detects test vs live mode and warns + 5s delay for live
- For each plan (Essentiel 3EUR/mo, Premium 5EUR/mo):
- Looks up existing price by lookup_key (freedge_essential_monthly,
freedge_premium_monthly) — idempotent, safe to re-run
- If missing, creates the product then the recurring price with the
lookup_key and nickname for clarity
- Prints the resulting price IDs with their env var names
- With --write-env flag, automatically upserts the values into
backend/.env preserving other lines
- Points to Customer Portal settings and stripe listen command as
next steps
npm scripts added:
- npm run stripe:setup # dry run, just print IDs
- npm run stripe:setup:write # update .env automatically
- npm run stripe:listen # shortcut for stripe CLI webhook forward
Updated README to show the script as the recommended path for step 1,
keeping the manual dashboard instructions as a fallback.
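The --write-env upsert could look like this sketch (the real script may differ): replace an existing KEY=... line or append one, leaving every other line untouched.

```typescript
// Upsert a KEY=value line into .env content without disturbing other lines.
function upsertEnv(content: string, key: string, value: string): string {
  const line = `${key}=${value}`;
  const re = new RegExp(`^${key}=.*$`, 'm');
  if (re.test(content)) return content.replace(re, line);
  return content.endsWith('\n') || content === ''
    ? `${content}${line}\n`
    : `${content}\n${line}\n`;
}
```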
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Backend:
- Prisma: add stripeSubscriptionId, subscriptionStatus, priceId,
currentPeriodEnd to User + migration SQL
- plugins/stripe.ts: getPlans catalog with env-based price IDs
- server.ts: raw body JSON parser for webhook signature verification,
skip rate limit on /stripe/webhook
- types/fastify.d.ts: declare rawBody on FastifyRequest
- routes/stripe.ts (new):
- GET /stripe/plans public
- GET /stripe/subscription user status
- POST /stripe/checkout hosted Checkout Session, lazy-creates
customer, dynamic payment methods, promo codes enabled
- POST /stripe/portal Billing Portal session
- POST /stripe/webhook signature verified, handles
checkout.session.completed, customer.subscription.*,
invoice.payment_failed. Resolves user by clientReferenceId,
metadata.userId, or stripeId fallback
- .env.example + README: Stripe setup, stripe CLI, test cards
Frontend:
- api/stripe.ts typed client (getPlans, getSubscription,
startCheckout, openPortal)
- pages/Pricing.tsx: 3-card grid (free/essentiel/premium) with
popular badge, current plan indicator, gradient popular card
- pages/CheckoutSuccess.tsx: animated confirmation with polling on
/stripe/subscription until webhook activates plan
- pages/Profile.tsx: SubscriptionCard above tabs — free users see an
upgrade banner, paid users see plan + status + next billing date
+ 'Gérer l'abonnement' button opening Customer Portal
- components/header.tsx: 'Tarifs' link in nav
- App.tsx: /pricing (public) and /checkout/success (protected) routes
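The webhook's user-resolution order can be sketched as (object shape simplified from Stripe's Checkout Session):

```typescript
// Resolve the user in order: clientReferenceId, then metadata.userId,
// then the stripeId (customer) fallback.
interface SessionLike {
  client_reference_id: string | null;
  metadata: { userId?: string } | null;
  customer: string | null;
}

function resolveUserLookup(s: SessionLike):
  | { by: 'clientReferenceId' | 'metadataUserId' | 'stripeId'; value: string }
  | null {
  if (s.client_reference_id) return { by: 'clientReferenceId', value: s.client_reference_id };
  if (s.metadata?.userId) return { by: 'metadataUserId', value: s.metadata.userId };
  if (s.customer) return { by: 'stripeId', value: s.customer };
  return null;
}
```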
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- reply.hijack() so Fastify doesn't send default 404 after handler returns
- Set Access-Control-Allow-Origin manually (onSend hooks don't fire on raw)
- Initial ': ok' comment line to flush headers immediately
- Guard send('error') in case stream already closed
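What gets written to the raw socket right after reply.hijack() can be sketched as the header block plus the initial comment line (helper name is ours):

```typescript
// onSend hooks never fire on a hijacked reply, so CORS is set by hand here.
function sseHead(allowedOrigin: string): { headers: Record<string, string>; preamble: string } {
  return {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
      'Access-Control-Allow-Origin': allowedOrigin,
    },
    preamble: ': ok\n\n', // SSE comment line, flushes headers immediately
  };
}
```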
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Backend:
- Prisma: add user preferences (dietaryPreference, allergies, maxCookingTime,
equipment, cuisinePreference, servingsDefault) + migration SQL.
Also make stripeId nullable so signup works without Stripe.
- prompts.ts: buildUserPrompt now takes a BuildPromptOptions with preferences.
Injects strong, explicit constraints in the user message (vegan rules,
allergy warnings, time limits, equipment availability, cuisine hints).
- recipe-generator.ts: new streamRecipe() async generator. Streams OpenAI
chat completion with json_schema strict mode, parses the growing buffer
to detect 'titre' and 'description' early, yields typed events:
{ type: 'delta' | 'title' | 'description' | 'complete' }
Final event includes the parsed StructuredRecipe + cost log.
- recipes.ts route: new POST /recipes/create-stream returning SSE:
event: progress { step }
event: transcription { text }
event: title { title } <- triggers parallel image gen
event: description { description }
event: recipe { recipe }
event: image { url }
event: saved { id }
event: done
Heartbeat every 15s to prevent proxy timeouts. Image generation is
kicked off the moment the title is extracted, running in parallel with
the rest of the recipe stream. Legacy POST /recipes/create still works
and now also passes user preferences.
- users.ts route: GET /profile now returns preferences (equipment
deserialized from JSON). New PUT /users/preferences with validation
(diet enum, time 5-600, servings 1-20, equipment array -> JSON).
- ai.ts plugin: generateRecipe signature extended to accept preferences.
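The early title/description detection on the growing buffer could be sketched like this (the regex approach is ours; the real parser may differ):

```typescript
// Pull a string field (e.g. "titre") out of a partial JSON buffer as soon
// as its closing quote has streamed in; null while the field is incomplete.
function extractEarlyField(buffer: string, field: string): string | null {
  const m = buffer.match(new RegExp(`"${field}"\\s*:\\s*"((?:[^"\\\\]|\\\\.)*)"`));
  return m ? (JSON.parse(`"${m[1]}"`) as string) : null;
}
```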
Frontend:
- api/recipe.ts: createRecipeStream() async generator that consumes SSE
via fetch + ReadableStream + TextDecoder (EventSource can't do POST).
Parses 'event:' and 'data:' lines, yields typed StreamEvent union.
- api/auth.ts: User interface extended with preferences; new
UserPreferences type exported.
- api/user.ts: updatePreferences() method.
- RecipeForm.tsx: handleSubmit now consumes the stream. Live UI displays:
1. Initial cooking loader with step label
2. Transcription appears as soon as it's ready
3. Title fades in the moment it's extracted (before the rest of the
recipe finishes generating)
4. Description appears right after
5. Image replaces the loader when ready
6. Small delay before navigating to the saved recipe detail page
- Profile.tsx: new 'Cuisine' tab with form for diet, allergies, max time,
servings, cuisine preference, and equipment checkboxes (8 options).
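The frame parsing the client does could be sketched as (simplified; the real generator keeps a carry-over buffer, since fetch chunks can split a frame in half):

```typescript
type StreamEvent = { event: string; data: unknown };

// Parse complete SSE frames out of a text chunk into typed events.
function parseSSEChunk(chunk: string): StreamEvent[] {
  const events: StreamEvent[] = [];
  for (const frame of chunk.split('\n\n')) {
    let event = 'message';
    let data = '';
    for (const line of frame.split('\n')) {
      if (line.startsWith('event:')) event = line.slice('event:'.length).trim();
      else if (line.startsWith('data:')) data += line.slice('data:'.length).trim();
    }
    if (data) events.push({ event, data: JSON.parse(data) });
  }
  return events;
}
```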
UX improvement: perceived latency is dramatically reduced. Instead of
waiting 40s staring at a spinner, the user sees the title ~3-4s in and
can start reading, while the image finishes generating in parallel.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Helmet's default 'Cross-Origin-Resource-Policy: same-origin' header was
blocking the frontend (http://localhost:5173) from loading images and
audio served by the backend at /uploads/*. Set policy to 'cross-origin'
so images can be embedded in the frontend.
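The change amounts to one Helmet option (registration call elided):

```typescript
// @fastify/helmet option overriding the default same-origin CORP header.
const helmetOptions = {
  crossOriginResourcePolicy: { policy: 'cross-origin' as const },
};
// await app.register(helmet, helmetOptions);
```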
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Image generation:
- Automatic model fallback: try gpt-image-1 first, fall back to
dall-e-3 if it fails (e.g. org not verified on OpenAI)
- Local filesystem fallback: if MinIO upload fails, write the image
to backend/uploads/recipes/ and return a URL served by fastify-static
- Unified handling of base64 vs URL responses from the Images API
- DALL-E quality mapped automatically (low/medium/high -> standard)
Local MinIO stack:
- docker-compose.yml at repo root with minio + minio-init service
that auto-creates the bucket and makes it publicly readable
- Default credentials: freedge / freedge123 (configurable)
- Console at :9001, API at :9000
- .env.example now points to the local stack by default
Static file serving:
- Register @fastify/static to serve ./uploads at /uploads/*
- Enables local fallback to return usable URLs to the frontend
- New PUBLIC_BASE_URL env var to build absolute URLs
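Building the absolute URL for a locally-stored file could look like this sketch (helper name is ours):

```typescript
// Combine PUBLIC_BASE_URL with the path under ./uploads, normalizing any
// trailing slash on the base.
function localUploadUrl(publicBaseUrl: string, relativePath: string): string {
  return `${publicBaseUrl.replace(/\/+$/, '')}/uploads/${relativePath}`;
}
```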
TypeScript errors (21 -> 0):
- JWT typing via '@fastify/jwt' module augmentation (FastifyJWT
interface with payload + user) fixes all request.user.id errors
- Stripe constructor now passes required StripeConfig
- fastify.createCustomer guards checked on the helper itself for
proper TS narrowing (not on fastify.stripe)
- Remove 'done' arg from async onClose hook
- MinIO transport.agent + listFiles return type cast ignored
README:
- Add 'Stockage des fichiers' section explaining the two modes
- Updated setup instructions to start with docker compose up
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Overhaul of the backend AI module to produce better recipes, better
images, more reliably, and cheaper.
New src/ai/ module:
- prompts.ts: long 'Chef Antoine' system prompt (~1500 tokens) with
explicit originality rules, technical precision requirements, vocal
transcription handling, and 3 few-shot style examples. Long enough
to benefit from OpenAI's automatic prompt caching (-50% on cached
portion from the 2nd call onward).
- recipe-generator.ts: uses Structured Outputs (json_schema strict).
Rich schema: titre, description, origine_inspiration, ingredients
with quantity/notes/complement flag, numbered etapes with per-step
duration, conseils array, accord_boisson. No more JSON.parse crashes.
- image-generator.ts: switched from dall-e-3 to gpt-image-1 (medium
quality by default). Much better photographic realism. Dedicated
magazine-style prompt (editorial food photography, 45-deg overhead,
natural light, stoneware). Slugify preserves extended Latin chars
(cote-de-boeuf not c-te-de-b-uf).
- transcriber.ts: migrated from whisper-1 to gpt-4o-mini-transcribe
(50% cheaper, better on French). Includes a context prompt to bias
toward culinary vocabulary.
- cost.ts: centralized pricing table + helpers. Every OpenAI call now
emits a structured log with model, durationMs, costUsd, usage, and
cacheHit flag.
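The extended-Latin-preserving slugify could look like this (a sketch; the real implementation may handle more ligatures):

```typescript
// Strip accents to their base letters (and expand œ/æ) instead of dropping
// the characters, so 'Côte de bœuf' slugs to 'cote-de-boeuf'.
function slugify(input: string): string {
  return input
    .toLowerCase()
    .replace(/œ/g, 'oe')
    .replace(/æ/g, 'ae')
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '') // drop combining diacritics
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
}
```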
Plugin refactor:
- plugins/ai.ts now delegates to src/ai/* and only keeps the Fastify
decoration glue + storage fallback for audio.
- OpenAI client configured with maxRetries=3, timeout=60s.
- Image generation runs in parallel with the recipe flatten/serialize
step (minor speedup, ~0.5s).
- flattenRecipe() converts the rich structured recipe into the legacy
flat RecipeData shape (for Prisma columns) while preserving the
structured form in recipeData.structured.
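flattenRecipe() might look roughly like this (field names beyond titre/description/ingredients/etapes are guesses at the real shapes):

```typescript
// Hypothetical shapes; the real StructuredRecipe/RecipeData live in src/ai/.
interface StructuredRecipe {
  titre: string;
  description: string;
  ingredients: { nom: string; quantite: string; notes?: string }[];
  etapes: { texte: string; duree_min?: number }[];
}

function flattenRecipe(r: StructuredRecipe) {
  return {
    title: r.titre,
    description: r.description,
    ingredients: r.ingredients.map((i) => `${i.quantite} ${i.nom}`.trim()),
    steps: r.etapes.map((e) => e.texte),
    structured: r, // rich form preserved alongside the flat columns
  };
}
```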
Routes:
- recipes.ts stores the structured JSON in generatedRecipe (instead of
  the lossy flattened form), enabling future frontends to render rich
  recipes with per-ingredient notes and step timers.
Env vars:
- OPENAI_TRANSCRIBE_MODEL, OPENAI_IMAGE_MODEL, OPENAI_IMAGE_QUALITY,
OPENAI_IMAGE_SIZE, OPENAI_MAX_RETRIES, OPENAI_TIMEOUT_MS
Cost per recipe (estimated):
- Before: ~$0.044 (whisper $0.003 + 4o-mini $0.0004 + dall-e-3 $0.04)
- After : ~$0.018 (4o-mini-transcribe $0.0015 + 4o-mini $0.0004
+ gpt-image-1 medium $0.0165), ~-59%.
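Rechecking the line items above (the unrounded sums give ~58%, consistent with the ~59% figure from the rounded totals):

```typescript
// Recomputing the per-recipe estimate from the line items above.
const before = 0.003 + 0.0004 + 0.04;    // whisper-1 + gpt-4o-mini + dall-e-3
const after = 0.0015 + 0.0004 + 0.0165;  // 4o-mini-transcribe + 4o-mini + gpt-image-1
const savingsPct = Math.round((1 - after / before) * 100);
```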
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>