Off-the-shelf CRMs leave real estate teams managing leads in spreadsheets, chasing down MLS data from three separate windows, and emailing agents PDF reports by hand every Monday. We built something better. Here is exactly how we architect custom real estate dashboards — from the database layer to the map overlays to the automated drip sequences — using React, Supabase, and a purpose-built integration stack.
Every mid-sized real estate operation we have worked with arrived with the same complaints: the existing CRM could not talk to their MLS feed, lead routing rules were buried in three menus, and generating the weekly broker report meant exporting four different CSV files and stitching them together in Excel. The pain was not a lack of tools — it was a surplus of disconnected ones.
Generic platforms like Salesforce, Zoho, or even purpose-built real estate tools such as Follow Up Boss or BoomTown are powerful, but they are built for the average brokerage's average workflow. The moment a client wanted to trigger a text message when a lead opened their third email, correlate showing activity against conversion rate by zip code, or give each agent a personalized dashboard with only their own pipeline — the platform hit a wall.
Custom development removes that ceiling entirely. What follows is a technical walkthrough of the architecture we deployed for a recent project: a full real estate operations portal serving agents, brokers, and administrators — built from scratch, deployed to production, and actively handling hundreds of leads per day.
The core requirement was a dashboard that always felt live — lead status changes made by one agent needed to be visible to their broker in under a second without anyone hitting refresh. For this we chose React 18 with Zustand for client-side state management and Supabase as the backend platform, leveraging its PostgreSQL foundation alongside the Realtime websocket layer built on top of it.
Supabase Realtime broadcasts row-level changes via PostgreSQL logical replication. Every agent's active lead list, every status update, and every document upload event emits a change payload that the front end receives and applies to local state immediately — no polling, no stale data. The architecture looks like this at a high level:
```typescript
// Supabase Realtime subscription — agent lead list
const channel = supabase
  .channel('leads-agent-feed')
  .on('postgres_changes', {
    event: '*',
    schema: 'public',
    table: 'leads',
    filter: `assigned_agent_id=eq.${agentId}`
  }, (payload) => {
    useLeadStore.getState().applyServerPatch(payload);
  })
  .subscribe();
```

The database schema was carefully normalized. Leads, properties, showings, drip sequences, and documents live in separate tables with foreign key relationships and row-level security (RLS) policies that enforce role-based access at the database layer — not just at the UI layer. This is a critical distinction: even if a front-end routing bug exposes the wrong route, the database refuses to return data that the authenticated user does not have permission to read.
Security note: All RLS policies were written using Supabase's auth.uid() function to bind permissions to the authenticated session token — never a client-supplied user ID. Agent-level policies restrict reads and writes to rows where assigned_agent_id = auth.uid(), while broker policies use a JOIN to the team membership table.
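A minimal sketch of what these policies can look like. The `team_members` table stands in for the team membership table described above, and the policy names are illustrative:

```sql
-- Agent policy: rows are visible only to the assigned agent. The check
-- runs inside Postgres, so it holds no matter what the client requests.
alter table leads enable row level security;

create policy "agent_reads_own_leads"
  on leads for select
  using (assigned_agent_id = auth.uid());

-- Broker policy (illustrative): a lead is visible when the broker shares
-- a team with the assigned agent, via the team membership table.
create policy "broker_reads_team_leads"
  on leads for select
  using (
    exists (
      select 1
      from team_members agent_m
      join team_members broker_m on broker_m.team_id = agent_m.team_id
      where agent_m.user_id  = leads.assigned_agent_id
        and broker_m.user_id = auth.uid()
        and broker_m.role    = 'broker'
    )
  );
```

Because `auth.uid()` is derived from the verified session token inside the database, there is no code path where a spoofed client-side user ID widens access.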
MLS data is messy. Different regional MLS boards use different RETS (Real Estate Transaction Standard) or RESO Web API endpoints, return fields with inconsistent naming conventions, and update their listings on unpredictable intervals. The first phase of development involved writing a normalization pipeline that could ingest data from the client's MLS provider and map it into a clean internal schema.
The ingestion pipeline runs as a Supabase Edge Function on a cron schedule, pulling delta updates from the MLS RESO Web API every 15 minutes. Each incoming property record is run through a normalization transformer before upsert:
```typescript
// Normalization transformer (simplified)
function normalizeListing(raw: MLSRecord): NormalizedProperty {
  return {
    mls_id: raw.ListingKey,
    address: toTitleCase(raw.UnparsedAddress),
    city: raw.City,
    state: raw.StateOrProvince,
    zip: raw.PostalCode,
    price: parseFloat(raw.ListPrice),
    bedrooms: parseInt(raw.BedroomsTotal ?? '0'),
    bathrooms: parseFloat(raw.BathroomsTotalDecimal ?? '0'),
    sqft: parseInt(raw.LivingArea ?? '0'),
    status: mapMLSStatus(raw.StandardStatus),
    geo: raw.Latitude && raw.Longitude
      ? `POINT(${raw.Longitude} ${raw.Latitude})`
      : null,
    raw_payload: raw,
    updated_at: new Date().toISOString()
  };
}
```
The geo field is stored as a PostGIS GEOGRAPHY(Point, 4326) type, enabling efficient spatial queries for the interactive map overlay — for example, finding all active listings within a 5-mile radius of a given coordinate pair without pulling the entire dataset to the client.
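As an illustration, the radius search described above can be expressed with PostGIS's `ST_DWithin`, which operates in meters on `GEOGRAPHY` values (5 miles is roughly 8,046.7 meters):

```sql
-- All active listings within 5 miles of a point ($1 = lng, $2 = lat).
-- GEOGRAPHY distances are in meters, so no manual unit conversion is
-- needed, and a GiST index on geo keeps this query fast.
SELECT id, address, price,
       ST_Distance(geo, ST_MakePoint($1, $2)::geography) AS meters_away
FROM properties
WHERE status = 'Active'
  AND ST_DWithin(geo, ST_MakePoint($1, $2)::geography, 8046.7)
ORDER BY meters_away;
```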
De-duplication was handled with a unique constraint on mls_id combined with an ON CONFLICT DO UPDATE upsert strategy. Price change history is preserved in a separate property_price_history table via a PostgreSQL trigger, giving agents a complete timeline of listing activity without any application-layer overhead.
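A sketch of what such a trigger can look like. The column names here are illustrative, not the production schema:

```sql
-- History table: one row per price change, written by the trigger below.
CREATE TABLE IF NOT EXISTS property_price_history (
  id          bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  property_id bigint NOT NULL REFERENCES properties (id),
  old_price   numeric,
  new_price   numeric,
  changed_at  timestamptz NOT NULL DEFAULT now()
);

-- Fires only when the price actually changed, so routine field updates
-- from the MLS sync do not create noise rows.
CREATE OR REPLACE FUNCTION log_price_change() RETURNS trigger AS $$
BEGIN
  IF NEW.price IS DISTINCT FROM OLD.price THEN
    INSERT INTO property_price_history (property_id, old_price, new_price)
    VALUES (NEW.id, OLD.price, NEW.price);
  END IF;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_price_history
  AFTER UPDATE ON properties
  FOR EACH ROW EXECUTE FUNCTION log_price_change();
```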
The lead pipeline was built as a Kanban-style board with configurable stages defined per brokerage. Each lead card carries a stage, a source (web form, referral, MLS inquiry, drip response, etc.), a lead score computed server-side, and a full activity timeline. Drag-and-drop stage transitions are optimistically applied in Zustand state and then confirmed via a Supabase RPC call, so the UI never feels laggy even on slower connections.
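The optimistic-update pattern can be sketched like this. The store shape and the RPC name `move_lead_stage` are illustrative, not the production API:

```typescript
type Lead = { id: string; stage: string };

// Pure helper: returns a new lead list with the card moved, plus the
// previous stage so the move can be rolled back if the server rejects it.
function moveLeadOptimistic(
  leads: Lead[],
  leadId: string,
  toStage: string
): { leads: Lead[]; previousStage: string | null } {
  const previous = leads.find((l) => l.id === leadId)?.stage ?? null;
  return {
    leads: leads.map((l) => (l.id === leadId ? { ...l, stage: toStage } : l)),
    previousStage: previous,
  };
}

// Assumed wiring: apply the move to Zustand state immediately, then
// confirm via an RPC call; roll back the local state on failure.
async function onCardDropped(
  rpc: (fn: string, args: object) => Promise<{ error: unknown }>,
  store: { leads: Lead[]; set: (leads: Lead[]) => void },
  leadId: string,
  toStage: string
): Promise<void> {
  const { leads, previousStage } = moveLeadOptimistic(store.leads, leadId, toStage);
  store.set(leads); // UI updates instantly
  const { error } = await rpc('move_lead_stage', { lead_id: leadId, to_stage: toStage });
  if (error && previousStage !== null) {
    store.set(moveLeadOptimistic(store.leads, leadId, previousStage).leads); // roll back
  }
}
```

The pure helper keeps the rollback path trivial to test, which is the main reason to separate it from the network call.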
For funnel analytics, we built a dedicated aggregation layer using materialized views in PostgreSQL. Rather than running heavy GROUP BY queries against the live leads table on every dashboard load, a scheduled job refreshes the materialized views every 30 minutes. The dashboard reads from these pre-aggregated views, which respond in under 50 ms.
Conversion funnels: Lead-to-showing and showing-to-offer conversion rates by agent, source, and zip code — refreshed every 30 minutes from materialized views.
Speed-to-lead: Tracks elapsed time from lead creation to first agent contact, with automated escalation alerts if the threshold is exceeded.
Source attribution: UTM parameter capture at lead creation maps every conversion back to the originating campaign, ad, or organic channel.
Lead scoring: A weighted scoring algorithm runs server-side on each lead activity event — email opens, page views, form submissions — and updates the score in real time.
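As a sketch, one such materialized view might look like the following. The stage values and column names are assumptions for illustration:

```sql
-- Conversion counts by agent and zip, pre-aggregated for the dashboard.
CREATE MATERIALIZED VIEW funnel_by_agent_zip AS
SELECT assigned_agent_id,
       zip,
       count(*)                                  AS leads,
       count(*) FILTER (WHERE stage = 'showing') AS showings,
       count(*) FILTER (WHERE stage = 'offer')   AS offers
FROM leads
GROUP BY assigned_agent_id, zip;

-- CONCURRENTLY lets the scheduled refresh run without blocking dashboard
-- reads; it requires a unique index on the view.
CREATE UNIQUE INDEX ON funnel_by_agent_zip (assigned_agent_id, zip);
REFRESH MATERIALIZED VIEW CONCURRENTLY funnel_by_agent_zip;
```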
Lead score is a particularly useful signal. We trained a simple weighted model using historical deal data: a lead that opens an email, clicks a property link, and returns to the site within 24 hours scores in the top decile and triggers an immediate high-priority notification to their assigned agent. Low-scoring leads route to a long-cadence nurture sequence instead of consuming agent bandwidth.
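The weighted-scoring idea can be sketched as a recency-weighted sum. The event names, weights, and decay rule below are illustrative, not the production model:

```typescript
type LeadEvent = { type: string; occurredAt: number }; // epoch ms

// Illustrative weights — the production values were fit to historical
// deal data, not hand-picked.
const WEIGHTS: Record<string, number> = {
  email_open: 5,
  property_link_click: 15,
  site_return: 20,
  form_submission: 30,
};

// Recency-weighted sum: events within the last 24 hours count at full
// weight; older events decay to half weight.
function scoreLead(events: LeadEvent[], now: number): number {
  const DAY_MS = 24 * 60 * 60 * 1000;
  return events.reduce((total, e) => {
    const base = WEIGHTS[e.type] ?? 0;
    const recent = now - e.occurredAt <= DAY_MS;
    return total + (recent ? base : base / 2);
  }, 0);
}
```

Running this server-side on every `lead_events` insert keeps the score consistent across clients and makes the top-decile notification threshold a simple comparison.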
Behavioral automation is where custom builds genuinely outperform off-the-shelf CRMs for sophisticated teams. Rather than setting up time-based sequences ("send email 3 days after signup"), the system we built triggers communications based on what the lead actually does.
Each lead action — an email open, a clicked property link, a portal login, a document view, a returned inbound call — is recorded as an event row in the lead_events table. A Supabase Edge Function subscribed to this table evaluates each incoming event against a set of trigger rules stored in the automation_rules table. When a rule fires, it enqueues a message job.
```typescript
// Trigger evaluation (Edge Function)
const matchedRules = allRules.filter(rule => {
  return rule.trigger_event === event.type
    && rule.conditions.every(c => evaluateCondition(c, lead, event));
});

for (const rule of matchedRules) {
  await enqueueMessage({
    type: rule.message_type, // 'email' | 'sms'
    lead_id: lead.id,
    template: rule.template_id,
    delay_ms: rule.delay_minutes * 60_000,
    metadata: { rule_id: rule.id, event_id: event.id }
  });
}
```

Email delivery uses Resend with React Email templates, giving the team pixel-perfect HTML emails that render consistently across clients. SMS goes through Twilio with E.164 phone number normalization on the way in and opt-out compliance handled automatically via Twilio's opt-out keywords. Every sent message is logged back to the lead's activity timeline, so an agent reviewing a lead card sees the full conversation history across both channels in chronological order.
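The inbound E.164 normalization step can be sketched as follows, assuming US/Canada numbers only; production code would typically lean on a dedicated library such as libphonenumber for full international support:

```typescript
// Normalize a user-entered phone number to E.164 (+1XXXXXXXXXX).
// Assumes North American numbering; anything ambiguous is rejected
// rather than guessed at.
function toE164(raw: string): string | null {
  const digits = raw.replace(/\D/g, '');
  if (digits.length === 10) return `+1${digits}`;         // e.g. 3205551234
  if (digits.length === 11 && digits.startsWith('1')) {
    return `+${digits}`;                                  // e.g. 13205551234
  }
  return null;
}
```

Rejecting unparseable input at the enqueue layer is deliberate: a malformed number should surface as a validation error on the lead record, not as a silent Twilio delivery failure.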
All drip sequences respect a global suppression list and honor CAN-SPAM / TCPA compliance rules enforced at the enqueue layer — not left to individual template authors. Unsubscribe events update a boolean column on the lead record and immediately halt all future sends for that lead, with no manual intervention required.
Three distinct user roles drive fundamentally different experiences within the same application. The role system was designed as a first-class concern rather than an afterthought bolted onto a single-user interface.
Agent: Sees only their assigned leads, their own pipeline, their showing calendar, and their individual performance metrics. Cannot view team-level financials or other agents' leads.
Broker: Full visibility across all agents on their team — pipeline totals, conversion funnels, lead source attribution, and document review for compliance. Can reassign leads between agents.
Administrator: Cross-brokerage visibility, user provisioning, automation rule management, MLS integration settings, and system-level reporting. Full audit log access.
The RLS policies in Supabase enforce these boundaries at the database layer. The React application then uses a usePermissions() hook that reads the authenticated user's role from the JWT claims and conditionally renders UI elements. This two-layer enforcement — database + UI — ensures that even if a front-end bug exposes an admin route to an agent, the database simply returns no rows for the unauthorized query.
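The client-side mapping behind a hook like usePermissions() can be sketched as a pure lookup. The permission keys below are illustrative:

```typescript
type Role = 'agent' | 'broker' | 'admin';
type Permission =
  | 'view_own_pipeline'
  | 'view_team_financials'
  | 'reassign_leads'
  | 'manage_automation_rules';

// Role -> permission grants. The role value itself comes from the
// verified JWT claims (e.g. a custom claim set at sign-in), never from
// client-writable state.
const GRANTS: Record<Role, Permission[]> = {
  agent: ['view_own_pipeline'],
  broker: ['view_own_pipeline', 'view_team_financials', 'reassign_leads'],
  admin: ['view_own_pipeline', 'view_team_financials', 'reassign_leads',
          'manage_automation_rules'],
};

function can(role: Role, permission: Permission): boolean {
  return GRANTS[role].includes(permission);
}
```

Because this check only gates rendering, a bypassed UI check still hits the RLS wall at the database; the hook exists for UX, not security.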
The map view was one of the most requested features and one of the more technically satisfying components to build. Using Mapbox GL JS layered on top of the PostGIS-powered property dataset, the map renders all active and pending listings as clustered pins that expand on zoom, color-coded by status.
Spatial queries run server-side. When the agent pans or zooms the map, the React component debounces the new bounding box coordinates and fires a PostGIS query against the normalized properties table:
```sql
-- Properties within current map viewport
SELECT id, address, price, status, bedrooms,
       ST_X(geo::geometry) AS lng,
       ST_Y(geo::geometry) AS lat
FROM properties
WHERE ST_Within(
        geo::geometry,
        ST_MakeEnvelope($1, $2, $3, $4, 4326)
      )
  AND status IN ('Active', 'Pending')
LIMIT 500;
```

Clicking a pin opens a flyout panel with full listing details, lead history for any leads associated with that address, showing availability, and a direct link to the document folder. Agents in the field on mobile can use this map to immediately identify nearby listings during a showing tour, see which ones have pending offers, and pull up the contact history for any interested buyers in their pipeline — without leaving the dashboard.
Leads themselves are also plotted on a separate "heat map" layer using their saved search area or most recently viewed properties, giving brokers a geographic visualization of where buyer interest is concentrated — useful for listing strategy and resource allocation across large metro coverage areas.
Every real estate transaction generates a significant volume of documents: purchase agreements, counter-offers, disclosures, inspection reports, title commitments, and closing statements. The portal needed to handle all of it without requiring agents to leave the platform or manage files in a separate Dropbox or Google Drive.
Files are stored in Supabase Storage buckets with per-object RLS policies. Each lead has an associated document folder. Uploads are handled directly from the browser to Supabase Storage via a signed upload URL — the file never touches our application server, keeping upload speeds fast and reducing infrastructure surface area.
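Assuming supabase-js v2's signed-upload storage methods (`createSignedUploadUrl` / `uploadToSignedUrl`), the browser-side flow can be sketched like this. The bucket is typed structurally so the sketch stands alone, and the path layout is illustrative:

```typescript
// One folder per lead; strip characters that are unsafe in object keys.
function buildDocumentPath(leadId: string, filename: string): string {
  const safe = filename.replace(/[^\w.\-]/g, '_');
  return `leads/${leadId}/${safe}`;
}

// Minimal structural type matching the supabase-js v2 calls this sketch
// assumes; in real code this is supabase.storage.from('documents').
type StorageBucket = {
  createSignedUploadUrl(path: string): Promise<{
    data: { token: string; path: string } | null;
    error: unknown;
  }>;
  uploadToSignedUrl(path: string, token: string, body: Blob): Promise<{ error: unknown }>;
};

async function uploadLeadDocument(
  bucket: StorageBucket,
  leadId: string,
  filename: string,
  file: Blob
): Promise<string> {
  const path = buildDocumentPath(leadId, filename);
  // Server-minted credential scoped to this single object path...
  const { data, error } = await bucket.createSignedUploadUrl(path);
  if (error || !data) throw new Error('could not mint signed upload URL');
  // ...then the browser streams the file straight to storage.
  const { error: uploadError } = await bucket.uploadToSignedUrl(data.path, data.token, file);
  if (uploadError) throw new Error('upload failed');
  return path;
}
```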
For e-signature workflows, the platform integrates with a document signing API. When an agent uploads a contract and marks it for signature, the system creates a signature envelope, sends signing request emails to the required parties, and listens for webhook events to update the document's status (draft, sent, partially signed, completed) in real time. Completed signed documents are automatically moved to a "Completed" subfolder and the lead's timeline receives a timestamped entry.
Compliance detail: All document uploads are versioned. Overwriting a file creates a new version row in the document_versions table rather than replacing the original — ensuring a full audit trail for compliance reviews and dispute resolution.
Eliminating manual reporting was one of the highest-ROI outcomes of this build. Previously, a staff member spent several hours each Monday compiling a broker performance report from multiple system exports. After deployment, that process is fully automated.
PDF generation uses a headless Chromium instance driven by Puppeteer, running in a dedicated Node.js service (Supabase Edge Functions run on Deno, so the PDF renderer lives in its own Node process). The report template is a React component rendered to HTML with all the data injected server-side, then captured as a pixel-perfect PDF. The output quality is indistinguishable from a manually designed report and includes charts, tables, and branded headers.
Weekly digest emails are scheduled via a Supabase pg_cron job that fires every Sunday at 11 PM. The job assembles the previous week's metrics for each broker — new leads, contacts made, showings scheduled, offers submitted, closings, and pipeline value change — then triggers an Edge Function that renders the digest template and dispatches it through Resend. Each broker receives a report scoped only to their own team's data.
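Registering such a job with pg_cron can look like the following sketch. Invoking the Edge Function through pg_net's `net.http_post` is an assumption about the wiring, and the URL and key are placeholders:

```sql
-- Sundays at 23:00 server time (cron: minute hour dom month dow).
select cron.schedule(
  'weekly-broker-digest',
  '0 23 * * 0',
  $$ select net.http_post(
       url := 'https://<project-ref>.supabase.co/functions/v1/weekly-digest',
       headers := '{"Authorization": "Bearer <service-role-key>"}'::jsonb
     ) $$
);
```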
On-demand PDF reports: Brokers can generate a full team performance report as a PDF at any time with one click — rendered server-side in under 4 seconds.
Weekly digest emails: Every broker receives a rich HTML email digest every Monday morning with the previous week's KPIs, zero manual work required.
Real-time notifications: Triggered alerts for high-priority leads, document completions, missed follow-up deadlines, and offer expirations send immediately on event.
CSV export: Any filtered lead view or pipeline report can be exported to CSV from the dashboard — useful for tax records, compliance, and external analysis.
Real estate agents do not sit at desks. The entire portal was designed mobile-first, with the assumption that a large percentage of active sessions would come from an iPhone on LTE while an agent is between showings.
The layout uses CSS Grid with minmax() column definitions so panels reflow gracefully from a three-column desktop layout to a single-column stacked mobile layout without any JavaScript-driven breakpoint logic. Touch targets are sized to a minimum of 44px per Apple's HIG and WCAG 2.1 AA guidelines. The lead pipeline Kanban board collapses to a swipeable card stack on mobile, with swipe-left to dismiss and swipe-right to advance stage.
The map component uses Mapbox's built-in touch gesture handling, and the property flyout panel slides up from the bottom of the screen on mobile — following native iOS sheet patterns that agents already understand intuitively. Document uploads on mobile use the native file picker, which on iOS surfaces the option to scan a physical document directly with the camera — a workflow agents use frequently for scanning physical signatures or inspection notes on site.
Load time on mobile was treated as a hard constraint. The React bundle is code-split by route using dynamic imports, so the initial JS payload for the login screen is under 80kb gzipped. Heavier components — the map, the PDF viewer, the analytics charts — load only when navigated to. Combined with Supabase's edge-cached REST layer, the dashboard's time-to-interactive on a typical 4G connection is consistently under 2.5 seconds.
The front end deploys to Netlify with atomic deploys and branch previews for staging. The Supabase project runs on a dedicated instance with point-in-time recovery enabled, daily automated backups, and connection pooling via PgBouncer. Edge Functions deploy independently from the main app, meaning drip sequence logic or report generation can be updated without a full front-end release.
Observability was built in from day one rather than retrofitted. Error tracking uses Sentry with custom context tagging — every error report includes the user's role, current route, and last three actions taken. Structured logs from Edge Functions flow into a Supabase Log Drain, making it straightforward to diagnose issues like a failed MLS sync or a stalled signature workflow without connecting to a production console.
The client's team owns the codebase outright. We deliver a fully documented repository, a Notion-based architecture wiki, and — critically — the application is built on open standards (PostgreSQL, React, standard REST and WebSocket APIs) rather than vendor-specific abstractions that would make future in-house development difficult. The goal of every custom build we deliver is that the client's own developers could maintain and extend it confidently on day two.
How much does a custom real estate dashboard cost? Custom real estate dashboards start at $4,999, with final pricing depending on the feature set. Systems with MLS integration, role-based access, and PDF reporting are custom-quoted.
Can you integrate with our MLS feed? Yes — we use the RESO Web API to ingest, normalize, and sync MLS data into custom PostgreSQL databases with 15-minute refresh cycles.
Where are you located, and do you work with clients elsewhere? We are based in Central Minnesota and have delivered custom real estate platforms for clients across Minnesota and nationwide.
What technology stack do you use? We use React 18 with TypeScript, Supabase (PostgreSQL + Realtime), PostGIS for spatial queries, Mapbox GL JS for maps, and Supabase Edge Functions for backend logic.
Whether you are a growing brokerage needing a full-stack lead-to-close platform or a property management company wanting a client-facing portal with real-time data, we build it from the ground up — on your timeline, to your exact spec.