March 21, 2026 · 6 min read

Build a Health Agent with Perplexity and Sahha

Use Sahha webhooks and Perplexity Sonar to build an AI health agent that delivers research-grounded, structured health insights for your app.

This guide shows you how to take a live Sahha webhook event, pass the relevant Sahha data into Perplexity, and return a structured output that your app can display or act on.

What you are building

The integration flow is:

  1. Sahha SDK in your app
  2. Sahha generates data
  3. Sahha webhook sends an event to your backend
  4. Your backend extracts and formats the payload
  5. Perplexity generates an output
  6. Your app stores or displays the result

A common use case is turning fresh Sahha data into:

  • a daily summary
  • a personalised engagement message
  • a short lifestyle insight
  • a structured JSON payload for your UI

Prerequisites

Before you start, make sure you already have:

  • Sahha SDK integrated into your app
  • a live Sahha webhook configured and receiving events
  • a backend endpoint that can receive webhook payloads
  • a Perplexity API key

Keep your Perplexity API key on your server, not in your client app.

Your backend should:

  1. receive the Sahha webhook
  2. validate and parse the incoming payload
  3. extract the Sahha fields you want to use
  4. build a prompt for Perplexity
  5. request a structured output
  6. store or return the result

Why use Perplexity here?

Perplexity’s Sonar API is useful when you want output that combines your Sahha context with web-grounded reasoning and a predictable format.

For this workflow, structured outputs are especially useful because they let you return machine-readable JSON that can be rendered directly in your product.

Example output shape

In this example, we will ask Perplexity to return:

{
  "headline": "Poor recovery trend detected",
  "summary": "The user appears to have had reduced recovery over the last 3 days.",
  "recommendation": "Reduce cognitive load tonight and prioritise an earlier sleep window.",
  "tone": "supportive"
}
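If your backend is written in TypeScript, you can mirror this shape with an interface and a small runtime guard before rendering the result. The names below follow the example output above:

```typescript
// Shape of the structured output requested from Perplexity.
// Field names match the example above; adjust if your schema differs.
interface HealthInsight {
  headline: string
  summary: string
  recommendation: string
  tone: string
}

// Runtime guard: the model is asked for this shape via a JSON schema,
// but it is still worth checking before rendering the result in your UI.
function isHealthInsight(value: unknown): value is HealthInsight {
  if (typeof value !== 'object' || value === null) return false
  const v = value as Record<string, unknown>
  return ['headline', 'summary', 'recommendation', 'tone'].every(
    (key) => typeof v[key] === 'string'
  )
}
```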

1. Create a webhook endpoint

Below is a minimal Node.js example using Express. You can adapt the same pattern for Next.js, Fastify, Cloudflare Workers, or your preferred backend framework.

import express from 'express'

const app = express()
app.use(express.json())

app.post('/webhooks/sahha', async (req, res) => {
  try {
    const payload = req.body

    // Optional: validate webhook authenticity here.
    // Optional: ignore event types you do not want to process.

    const result = await generatePerplexityOutput(payload)

    // Save the result to your database, send it to your app,
    // or trigger the next step in your workflow.
    console.log(result)

    return res.status(200).json({ ok: true, result })
  } catch (error) {
    console.error(error)
    return res.status(500).json({ ok: false, error: 'Failed to process webhook' })
  }
})

app.listen(3000, () => {
  console.log('Listening on port 3000')
})
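The authenticity check mentioned in the comment above can be sketched as an HMAC comparison. The header name and signing scheme below are assumptions for illustration, not Sahha's documented contract; confirm both against your Sahha webhook settings:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Sketch of HMAC-based webhook verification. The signing scheme
// (HMAC-SHA256 over the raw body, hex-encoded) and the header you read
// the signature from are assumptions -- confirm them against Sahha's
// webhook documentation before relying on this.
function verifySignature(rawBody: string, signature: string, secret: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex')
  const a = Buffer.from(expected)
  const b = Buffer.from(signature)
  // timingSafeEqual throws on length mismatch, so check lengths first.
  return a.length === b.length && timingSafeEqual(a, b)
}
```

Note that verifying a signature over the raw body requires access to the unparsed request bytes (for example, `express.raw()` on that route), because re-serialising `req.body` after `express.json()` may not reproduce the exact bytes that were signed.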

2. Extract the Sahha data you want to send

Your webhook payload may contain more data than you need. In most cases, you should reduce it to only the fields that matter for the output you want to generate.

For example:

function mapSahhaPayload(payload: any) {
  return {
    profileId: payload.profileId,
    timestamp: payload.timestamp,
    scores: payload.scores,
    biomarkers: payload.biomarkers,
    // Add or remove fields based on your webhook payload shape.
  }
}

The goal is to give Perplexity enough context to generate a strong answer without sending unnecessary raw data.
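Webhook payloads often arrive with some fields missing, so one option is to strip empty values before serialising the context into the prompt. This helper is a small addition to the mapping step above:

```typescript
// Remove undefined/null fields before serialising the context into the
// prompt, so missing scores or biomarkers do not appear as "null" noise.
function compactContext<T extends Record<string, unknown>>(context: T): Partial<T> {
  return Object.fromEntries(
    Object.entries(context).filter(([, value]) => value !== undefined && value !== null)
  ) as Partial<T>
}
```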

3. Send the Sahha context to Perplexity

This example uses the Perplexity Sonar API and requests a structured JSON response.

Create a helper like this:

const PERPLEXITY_API_KEY = process.env.PERPLEXITY_API_KEY

async function generatePerplexityOutput(payload: any) {
  const sahha = mapSahhaPayload(payload)

  const response = await fetch('https://api.perplexity.ai/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${PERPLEXITY_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'sonar-pro',
      messages: [
        {
          role: 'system',
          content:
            'You are a behavioural health assistant. Return concise, safe, supportive outputs for use inside a product experience.'
        },
        {
          role: 'user',
          content: [
            'You will receive Sahha webhook data from a health and lifestyle application.',
            'Using only the data below, create a short output for the end user.',
            'Return valid JSON with the fields: headline, summary, recommendation, tone.',
            '',
            'Sahha data:',
            JSON.stringify(sahha, null, 2),
          ].join('\n')
        }
      ],
      response_format: {
        type: 'json_schema',
        json_schema: {
          schema: {
            type: 'object',
            additionalProperties: false,
            properties: {
              headline: { type: 'string' },
              summary: { type: 'string' },
              recommendation: { type: 'string' },
              tone: { type: 'string' }
            },
            required: ['headline', 'summary', 'recommendation', 'tone']
          }
        }
      }
    })
  })

  if (!response.ok) {
    const errorText = await response.text()
    throw new Error(`Perplexity request failed: ${errorText}`)
  }

  const data = await response.json()

  return {
    output: JSON.parse(data.choices[0].message.content),
    citations: data.citations ?? [],
    searchResults: data.search_results ?? []
  }
}

4. Return or store the output

Once Perplexity responds, you can:

  • store the output in your database
  • attach it to a user timeline or feed
  • send it to your frontend in real time
  • use it to trigger another workflow

Example response object:

{
  "output": {
    "headline": "Poor recovery trend detected",
    "summary": "Your recent data suggests recovery has trended lower over the last few days.",
    "recommendation": "Aim for a lighter evening and an earlier sleep window tonight.",
    "tone": "supportive"
  },
  "citations": [],
  "searchResults": []
}

Prompting tips

A few implementation details matter here:

Put the Sahha context in the user message

Perplexity uses the user message to drive the main query. That means your actual Sahha-derived context and instructions should live in the user message, not only in the system message.

Keep the output schema tight

If the response is going into a product UI, define a strict JSON schema and keep the number of fields small.

Be specific about tone and length

If you want a brief product-ready output, say so clearly in the prompt.

For example:

Write a supportive summary in under 40 words.
Avoid medical claims.
Do not diagnose.

If you want sources, use the citations or search_results fields returned by Perplexity instead of trying to force links into the structured JSON.

Testing the flow

A simple way to test this integration is:

  1. trigger a real or test Sahha event
  2. confirm your webhook receives the payload
  3. log the mapped Sahha data
  4. send it to Perplexity
  5. inspect the returned JSON before saving it
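To exercise the flow without waiting for a real event, you can post a synthetic payload to your local endpoint with curl or fetch. The shape below follows the mapping example earlier in this guide, not a documented Sahha schema:

```typescript
// Build a synthetic payload matching the shape the mapping step
// expects. Field names and values here are illustrative, not taken
// from a real Sahha event.
function buildTestPayload() {
  return {
    profileId: 'test-profile-123',
    timestamp: new Date().toISOString(),
    scores: [{ type: 'sleep', score: 0.42 }],
    biomarkers: [{ type: 'sleep_duration', value: 312, unit: 'minute' }],
  }
}
```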

You should also test:

  • missing fields
  • malformed payloads
  • duplicate webhook deliveries
  • Perplexity timeout or non-200 responses

Production notes

Before shipping this flow, make sure you:

  • verify webhook authenticity
  • implement retries and idempotency
  • avoid passing unnecessary personal data
  • log failures without exposing sensitive data
  • cache or debounce repeated outputs if your webhook volume is high
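The idempotency point above can be sketched as a dedupe keyed by an event identifier. The `eventId` field name is an assumption; use whatever unique ID your Sahha webhook payload actually carries, and back the store with Redis or your database if you run multiple instances:

```typescript
// In-memory idempotency guard keyed by an event identifier. Suitable
// only for a single process; use Redis or your database for
// multi-instance deployments.
const seenEvents = new Map<string, number>()
const DEDUPE_WINDOW_MS = 10 * 60 * 1000 // ignore repeats within 10 minutes

function isDuplicateEvent(eventId: string, now = Date.now()): boolean {
  const lastSeen = seenEvents.get(eventId)
  if (lastSeen !== undefined && now - lastSeen < DEDUPE_WINDOW_MS) {
    return true
  }
  seenEvents.set(eventId, now)
  return false
}
```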

Example use cases

You can reuse this pattern for:

  • daily readiness summaries
  • recovery-based engagement copy
  • behavioural nudges
  • personalised onboarding follow-ups
  • coach-facing or admin-facing summaries

Next step

Once this is working, the next improvement is usually to standardise multiple output types.

For example, you might create separate prompt templates for:

  • user-facing summaries
  • coach-facing insights
  • high-risk flags
  • weekly recaps

That lets the same Sahha webhook power multiple downstream experiences.
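One way to standardise this is a per-output-type registry of system prompts, selected when the webhook is processed. The template text below is illustrative; adapt the instructions to your product:

```typescript
// Sketch of a per-output-type prompt template registry.
type OutputType = 'user_summary' | 'coach_insight' | 'risk_flag' | 'weekly_recap'

const systemPrompts: Record<OutputType, string> = {
  user_summary:
    'You are a behavioural health assistant. Write a short, supportive summary for the end user. No medical claims.',
  coach_insight:
    'You are assisting a human coach. Summarise notable trends in the data in a neutral, factual tone.',
  risk_flag:
    'Identify any concerning patterns in the data. Be conservative and never diagnose.',
  weekly_recap:
    'Summarise the last seven days of data as a brief recap for the end user.',
}

function systemPromptFor(type: OutputType): string {
  return systemPrompts[type]
}
```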