Workflows & Data Flows

Key user journeys in Open SaaS — signup, payments, AI features, file uploads, and admin

This page traces the most important user journeys through Open SaaS — from user actions in the browser through the backend and external services. Understanding these flows reveals how a modern SaaS application handles real-world business logic.

Reading Guide

Each workflow shows the complete data path from user action to final result. Pay attention to where external services (Stripe, S3, OpenAI) enter the picture — these are the integration points that make SaaS apps complex.

How a Request Flows Through Open SaaS

The general pattern for every user interaction:

Browser (User Action)
    │
    ▼
React Component           ← user interface
    │ calls useQuery / useAction
    ▼
Wasp RPC Layer            ← auto-generated, type-safe
    │ HTTP request
    ▼
Node.js Server Function   ← business logic
    │
    ├──► Prisma (PostgreSQL)   ← data operations
    ├──► External APIs         ← Stripe, OpenAI, S3
    └──► PgBoss                ← queue background jobs
    │
    ▼
Response → React Component ← auto-typed, triggers re-render
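
To make the server side of this pattern concrete, here is a minimal sketch of a Wasp-style action (names and types are illustrative, not from the Open SaaS codebase): the client's typed args arrive as the first parameter, and Wasp injects a context object carrying the authenticated user and Prisma entity delegates.

```typescript
// Hypothetical Wasp-style action. In a real Wasp app the context type is
// generated and the error would be an HttpError from 'wasp/server'.
type Context = {
  user?: { id: string };
  entities: { Task: { create: (args: any) => Promise<any> } };
};

export const createTask = async (
  args: { description: string },
  context: Context
) => {
  // Wasp injects the session's user; no user means an unauthenticated request.
  if (!context.user) {
    throw new Error('401: not authenticated');
  }
  // Persist via the Prisma delegate exposed on context.entities.
  return context.entities.Task.create({
    data: { description: args.description, userId: context.user.id },
  });
};
```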

Workflow 1: User Signup & Onboarding

User visits signup page

The React frontend renders the signup form at /signup. Wasp provides the auth UI components automatically.

User submits email + password

The form calls Wasp's built-in auth action. Wasp handles:

  1. Password hashing (bcrypt)
  2. User record creation in PostgreSQL
  3. Admin check (compares email against ADMIN_EMAILS env var)
  4. Default credits assignment (3 credits)
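
Steps 3 and 4 amount to a small piece of signup-time logic. A sketch, assuming ADMIN_EMAILS is a comma-separated list (field names follow the description above, not verbatim Open SaaS code):

```typescript
// Sketch of signup-time defaults: admin flag from an env var, 3 starting credits.
// ADMIN_EMAILS example: "alice@example.com,bob@example.com"
export function signupDefaults(
  email: string,
  adminEmailsEnv: string = process.env.ADMIN_EMAILS ?? ''
): { isAdmin: boolean; credits: number } {
  const adminEmails = adminEmailsEnv.split(',').map((e) => e.trim());
  return {
    isAdmin: adminEmails.includes(email),
    credits: 3, // default free credits on signup
  };
}
```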

Email verification sent

Wasp triggers the email verification flow:

  1. Generates a verification token
  2. Calls the custom email template function
  3. Sends email via configured provider (SendGrid/Mailgun)

User clicks the verification link

  1. Token validated against the database
  2. User marked as emailVerified: true
  3. Redirected to the app with an active session

Session established

Wasp creates a session cookie. All subsequent requests include the session, and Wasp auto-injects the user object into every server-side query/action.

For PMs: Why Email Verification?

Email verification prevents fake signups, ensures deliverability for transactional emails (password resets, receipts), and is required by payment processors. Without it, bots could create thousands of accounts and abuse the free credit system.

Workflow 2: Subscription Payment (Stripe)

This is the core monetization flow — converting a free user to a paying subscriber.

User clicks "Upgrade to Pro"

The pricing page displays plans with a "Subscribe" button. Clicking it calls the createCheckoutSession action.

Backend creates a Stripe Checkout Session

Node.js Action:
  1. Check if user already has a Stripe customer ID
  2. If not, create a Stripe customer (deduplicates by email)
  3. Create a Checkout Session with:
     - Plan price ID (from env vars)
     - Customer ID
     - Success/cancel redirect URLs
     - Automatic tax collection enabled
     - Promo code support
  4. Return the Checkout Session URL
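
Those steps end in a single `stripe.checkout.sessions.create(...)` call. Assembling its parameters is pure logic, sketched here with illustrative names (the price ID and redirect URLs come from env vars in Open SaaS; `buildCheckoutParams` is a hypothetical helper):

```typescript
// Builds the params object passed to stripe.checkout.sessions.create().
export function buildCheckoutParams(customerId: string, priceId: string, appUrl: string) {
  return {
    mode: 'subscription' as const,
    customer: customerId,                     // deduplicated Stripe customer
    line_items: [{ price: priceId, quantity: 1 }],
    success_url: `${appUrl}/checkout?success=true`,
    cancel_url: `${appUrl}/checkout?canceled=true`,
    automatic_tax: { enabled: true },         // automatic tax collection
    allow_promotion_codes: true,              // promo code support
  };
}

// Usage with the real Stripe SDK:
//   const session = await stripe.checkout.sessions.create(buildCheckoutParams(...));
//   return session.url;
```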

User completes payment on Stripe

The user is redirected to Stripe's hosted checkout page. They enter payment details. Stripe handles PCI compliance — Open SaaS never sees raw card numbers.

Stripe fires webhook → Open SaaS processes it

Stripe sends POST /payments-webhook
    │
    ▼
Middleware verifies HMAC signature
    │
    ▼
Event: "invoice.paid"
    ├── Is it a subscription payment?
    │     → Update user: subscriptionStatus = "active"
    │                    subscriptionPlan = "pro"
    │                    datePaid = now()
    └── Is it a credits purchase?
          → Increment user.credits by 10
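
The branch above reduces to a pure mapping from event to user update, which keeps the handler easy to test. A sketch (signature verification itself is done separately, e.g. with Stripe's `constructEvent`; field names follow the diagram, and how a subscription is distinguished from a credits purchase is left abstract here):

```typescript
// Maps a verified invoice.paid event to the user fields to update.
// Whether it's a subscription vs. a credits purchase would be derived
// from the invoice's line items in the real handler.
export function userUpdateForInvoicePaid(opts: {
  isSubscription: boolean;
  currentCredits: number;
}): Record<string, unknown> {
  if (opts.isSubscription) {
    return {
      subscriptionStatus: 'active',
      subscriptionPlan: 'pro',
      datePaid: new Date(),
    };
  }
  // One-time credits purchase: add 10 credits.
  return { credits: opts.currentCredits + 10 };
}
```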

User returns to app with active subscription

The success redirect URL loads. The frontend queries the user object, sees subscriptionStatus: "active", and unlocks premium features.

Why Webhooks Instead of Polling?

After payment, the user is redirected back to the app. But Stripe's payment confirmation is asynchronous — it might take seconds. Instead of polling Stripe, the webhook fires the moment payment is confirmed and updates the database. By the time the user's browser loads, their subscription is already active.

Workflow 3: AI Schedule Generation (Credits)

The demo app shows how to monetize AI features with a credit system.

User creates tasks

On the demo page, the user adds tasks with descriptions and time estimates. These are saved to the Task table via Wasp actions.

User clicks "Generate Schedule"

The frontend calls the generateSchedule action.

Backend checks credits

// Inside the generateSchedule action; `user` is context.user (injected by Wasp)
// and HttpError comes from 'wasp/server'.
if (user.subscriptionStatus === 'active') {
  // Subscriber — unlimited access, skip credit check
} else if (user.credits > 0) {
  // Free user with credits — deduct 1 (atomic decrement)
  await prisma.user.update({
    where: { id: user.id },
    data: { credits: { decrement: 1 } }
  });
} else {
  // No credits, no subscription — 402 Payment Required
  throw new HttpError(402, 'Not enough credits');
}

Backend calls OpenAI

// Note: `functions` / `function_call` are the legacy function-calling
// parameters; newer OpenAI SDKs express the same idea as `tools` / `tool_choice`.
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: "You are a scheduling assistant..." },
    { role: "user", content: JSON.stringify(tasks) }
  ],
  functions: [{
    name: "createSchedule",
    parameters: { /* structured output schema */ }
  }],
  function_call: "auto"
});

OpenAI returns a structured JSON schedule via function calling (not free-text).
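
Extracting that schedule means parsing the function-call arguments out of the completion. A minimal sketch, matching the legacy `functions` response shape used above (`parseScheduleFromCompletion` is a hypothetical helper; error handling is deliberately thin):

```typescript
// Pulls the JSON arguments out of a function-calling completion.
type Completion = {
  choices: { message: { function_call?: { name: string; arguments: string } } }[];
};

export function parseScheduleFromCompletion(completion: Completion): unknown {
  const call = completion.choices[0]?.message.function_call;
  if (!call) throw new Error('Model did not return a function call');
  return JSON.parse(call.arguments); // arguments arrive as a JSON string
}
```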

Response stored and displayed

The AI response is saved to the GptResponse table and returned to the frontend, which renders the schedule.

Why Function Calling?

Regular GPT responses are free-text — hard to parse reliably. Function calling tells the AI to return data in a specific JSON schema. The AI "calls a function" with structured parameters, giving the app predictable, parseable output every time. No regex parsing of AI text needed.

Workflow 4: File Upload (S3 Presigned URL)

User selects a file

The React component captures the file from an <input type="file"> element.

Frontend requests a presigned upload URL

Calls a Wasp action that generates an S3 presigned URL:

Backend:
  1. Generate a unique S3 key (e.g., "uploads/{userId}/{uuid}/{fileName}")
  2. Create a presigned PUT URL (valid for 5 minutes)
  3. Return { uploadUrl, s3Key }
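
Step 1 can be sketched as follows (the key layout mirrors the example above; the presigned URL itself would then come from the AWS SDK's `getSignedUrl` with a `PutObjectCommand`):

```typescript
import { randomUUID } from 'node:crypto';

// Builds a unique S3 key namespaced by user, so uploads never collide
// and per-user files are easy to list or delete later.
export function makeS3Key(userId: string, fileName: string): string {
  // Strip path separators so a crafted filename can't escape the prefix.
  const safeName = fileName.replace(/[/\\]/g, '_');
  return `uploads/${userId}/${randomUUID()}/${safeName}`;
}
```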

Browser uploads directly to S3

The frontend PUTs the file to the presigned URL. The file goes directly from the browser to AWS S3 — it never touches the Node.js server.

File metadata saved to database

After successful upload, the frontend calls another action to save:

{ name: "report.pdf", type: "application/pdf", s3Key: "uploads/..." }

File available for download

To retrieve the file, the backend generates a presigned GET URL from the stored S3 key and returns it to the frontend.

Workflow 5: Admin Dashboard Analytics

Background job runs daily (PgBoss cron)

Wasp schedules a cron job that runs every 24 hours:

job dailyStatsJob {
  executor: PgBoss,
  schedule: { cron: "0 0 * * *" },   // midnight daily
  perform: {
    // import path is illustrative
    fn: import { calculateDailyStats } from "@src/analytics/stats"
  }
}

Stats aggregation

The job queries:

  • Total page views (from analytics provider or custom tracking)
  • User count (total and paid)
  • Revenue (from payment records)
  • User deltas (new signups, new paid, churned)
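
The delta fields are computed against the previous day's row. A sketch of that step (field names follow the DailyStats record shown below; `computeDeltas` is a hypothetical helper):

```typescript
// Computes day-over-day deltas for the stats job. Yesterday's row may be
// missing on the first run, in which case deltas are just today's totals.
type Counts = { userCount: number; paidUserCount: number };

export function computeDeltas(today: Counts, yesterday?: Counts) {
  return {
    userDelta: today.userCount - (yesterday?.userCount ?? 0),
    paidUserDelta: today.paidUserCount - (yesterday?.paidUserCount ?? 0),
  };
}
```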

Results stored in DailyStats

DailyStats {
  date: "2024-01-15",
  totalViews: 1234,
  userCount: 567,
  paidUserCount: 89,
  userDelta: 12,
  paidUserDelta: 3,
  totalRevenue: 1599.00,
  totalProfit: 1279.00
}

Admin views dashboard

The admin dashboard (isAdmin check) queries DailyStats and renders charts showing trends over time.

Information Flow Summary

From → To             Mechanism            Example
------------------    -----------------    -----------------------------
Browser → Backend     Wasp RPC (HTTP)      Creating tasks, requesting AI
Backend → Database    Prisma ORM           All data operations
Backend → Stripe      REST API             Creating checkout sessions
Stripe → Backend      Webhook (POST)       Payment confirmations
Backend → OpenAI      REST API             AI text generation
Browser → S3          Presigned PUT URL    File uploads (direct)
Backend → Email       SMTP/API             Verification, receipts
PgBoss → Backend      Job queue            Cron jobs, scheduled tasks

What's Next