Founder project · Public product · AI search

ThaiCondoAI

A public condo-search product for Thailand that combines semantic search, map discovery, authentication, and monetization.

Founder / Full-Stack Engineer

5k+
active users
pgvector
semantic search core
Stripe
monetization

Work overview

A quick read on where the work sat, what made it hard, what I owned, and what changed.

Context

ThaiCondoAI started from a practical problem in Thailand property search: listings are easy to collect but hard to compare. The useful part is not another AI chat box; it is helping someone describe what they want, then bringing back listings they can still inspect, filter, and place on a map.

What was hard

Keyword search breaks down when users describe lifestyle, commute, location feel, or tradeoffs instead of exact listing terms. At the same time, real-estate UX cannot hide behind AI output; users still need visible listings, prices, locations, and trust signals.

My part

I owned the product end to end: search relevance, map UX, Supabase data access, auth, Stripe monetization, Playwright checks, Docker workflows, and Vercel deployment.

What changed

The product now has 5k+ active users and a stack built around semantic retrieval, map-based exploration, protected data access, and paid search flows.

Constraints that mattered

  • Listing text is messy, repetitive, and often written for portals rather than human comparison.
  • AI costs had to stay controlled; not every click should trigger generation.
  • Search and map results had to agree; otherwise the product would feel clever but unreliable.
  • Auth, RLS, and payments needed clear boundaries before public traffic grew.
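One way to keep the cost constraint concrete is to cache query embeddings so repeated or trivially re-phrased searches never re-trigger generation. A minimal sketch, assuming an in-memory cache; `embedQuery` is a stand-in for the real embeddings API call (which would be async), and all names here are illustrative rather than the project's actual code.

```typescript
// Illustrative sketch: cache embeddings by normalized query text so that
// near-identical searches never call the embedding API twice.
type Embedding = number[];

const embeddingCache = new Map<string, Embedding>();
let apiCalls = 0; // tracked here only to demonstrate the saving

// Collapse whitespace and case so trivially different inputs share an entry.
function normalizeQuery(query: string): string {
  return query.trim().toLowerCase().replace(/\s+/g, " ");
}

// Stand-in for the real (async) embeddings API call.
function embedQuery(query: string): Embedding {
  apiCalls += 1;
  return Array.from(query).map((c) => c.charCodeAt(0) / 255);
}

function getEmbedding(query: string): Embedding {
  const key = normalizeQuery(query);
  const cached = embeddingCache.get(key);
  if (cached) return cached;
  const embedding = embedQuery(key);
  embeddingCache.set(key, embedding);
  return embedding;
}
```

The same idea extends to persisting embeddings in Redis or Postgres so a cold deploy does not reset the cache.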

Approach

01

Use AI where keywords fail

Embeddings and pgvector handle fuzzy intent, while normal filters and listing fields stay visible for comparison.
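As a sketch of how that split can look in practice, the hard filters become ordinary SQL predicates and pgvector's distance operator only orders what survives them. Table and column names below are assumptions for illustration, not the production schema.

```typescript
// Illustrative: build a hybrid query where relational filters narrow the set
// and pgvector cosine distance (`<=>`) orders it by semantic closeness.
interface ListingFilters {
  maxPrice?: number;
  minBedrooms?: number;
}

function buildSearchQuery(filters: ListingFilters): { sql: string; params: unknown[] } {
  // $1 is reserved for the query embedding vector.
  const params: unknown[] = ["<query embedding>"];
  const where: string[] = [];

  if (filters.maxPrice !== undefined) {
    params.push(filters.maxPrice);
    where.push(`price <= $${params.length}`);
  }
  if (filters.minBedrooms !== undefined) {
    params.push(filters.minBedrooms);
    where.push(`bedrooms >= $${params.length}`);
  }

  const whereClause = where.length > 0 ? `WHERE ${where.join(" AND ")}` : "";
  const sql = [
    "SELECT id, title, price, bedrooms",
    "FROM listings",
    whereClause,
    "ORDER BY embedding <=> $1", // cosine distance: closest intent first
    "LIMIT 20",
  ].filter(Boolean).join("\n");

  return { sql, params };
}
```

Keeping the vector ordering and the filters in one statement is what lets relevance tuning and access rules live next to the data model.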

02

Keep the map in the decision loop

A result can match the query and still be wrong for the user if the location does not work. The map keeps that tradeoff visible.
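A minimal sketch of that loop: whatever the semantic ranking says, only listings inside the current map viewport reach the screen, so the result list and the map can never disagree. The shapes below are illustrative assumptions, not the production types.

```typescript
// Illustrative: intersect semantically ranked results with the map viewport
// so the visible list and the map always agree.
interface RankedListing { id: string; lat: number; lng: number; score: number }
interface Viewport { north: number; south: number; east: number; west: number }

function visibleResults(ranked: RankedListing[], view: Viewport): RankedListing[] {
  return ranked.filter(
    (l) =>
      l.lat <= view.north &&
      l.lat >= view.south &&
      l.lng <= view.east &&
      l.lng >= view.west,
  );
}
```

Note that a high-scoring match outside the viewport is dropped, not hidden in a separate tab: the location tradeoff stays in front of the user.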

03

Build public-product plumbing early

Supabase RLS, Stripe, tests, Docker, and deployment checks were treated as part of the product, not cleanup after the UI was done.

Key decisions

A few engineering choices that shaped the implementation and delivery path.

Decision

Keep vectors close to the relational data with PostgreSQL and pgvector

Reason

The product needed semantic search without splitting listing data, user data, and ranking logic across too many moving parts.

Impact

Search stayed close to the data model, which made relevance tuning and access rules easier to reason about.

Decision

Do not make the product a pure chat interface

Reason

Property decisions depend on comparison: location, price, size, surrounding area, and confidence in the underlying listing.

Impact

AI improves discovery without taking away the listing and map surfaces people need to make a decision.

Decision

Test the paths that can lose trust

Reason

Auth, payment, and search flows are where a small bug quickly becomes a product problem.

Impact

Playwright checks and clearer boundaries made those flows less fragile as the product grew.

What I did

Built

  • Semantic property search with OpenAI embeddings, PostgreSQL, and pgvector.
  • Map-based discovery using Mapbox with location-aware browsing flows.
  • Authentication, RLS-backed data access, Stripe monetization, and Vercel deployment.

Improved

  • Search relevance by separating user intent from literal listing keywords.
  • Product trust through cleaner result surfaces, tests, and deployment workflows.
  • Operational confidence with Playwright coverage and Docker-based local workflows.
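The intent-versus-keyword separation can be sketched roughly as follows: hard constraints get parsed out for SQL filters, and only the fuzzy remainder goes to embeddings. The regexes below are toy patterns to show the shape of the split, not the product's actual query interpreter.

```typescript
// Illustrative sketch: split a free-text query into structured constraints
// (for SQL filters) and a semantic remainder (for embedding search).
interface ParsedQuery {
  maxPrice?: number;
  bedrooms?: number;
  semanticText: string; // what actually gets embedded
}

function parseQuery(raw: string): ParsedQuery {
  let text = raw.toLowerCase();
  const parsed: ParsedQuery = { semanticText: "" };

  const price = text.match(/under\s+([\d,]+)/);
  if (price) {
    parsed.maxPrice = Number(price[1].replace(/,/g, ""));
    text = text.replace(price[0], " ");
  }

  const beds = text.match(/(\d+)\s*(?:bedrooms?|beds?|br)/);
  if (beds) {
    parsed.bedrooms = Number(beds[1]);
    text = text.replace(beds[0], " ");
  }

  parsed.semanticText = text.replace(/\s+/g, " ").trim();
  return parsed;
}
```

The payoff is that "2 bedroom condo near BTS under 5,000,000" embeds only "condo near bts", so the vector search matches lifestyle and location feel instead of being polluted by numbers the database can filter exactly.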

Owned

  • Founder-level product decisions across UX, monetization, technical architecture, and shipping.
  • End-to-end implementation from database structure to public web surface.
  • Tradeoffs between AI features, cost control, speed, and discoverability.

Semantic real-estate search stack

The public product combines normal product infrastructure with an AI retrieval layer and map-first user experience.

01

Web Experience

Next.js search, result, auth, and map surfaces optimized for discovery.

02

Search Intelligence

OpenAI embeddings, query interpretation, and pgvector similarity search.

03

Data Platform

PostgreSQL, Supabase RLS, Redis, and listing/search persistence.

04

Business Layer

Stripe monetization, Playwright checks, Docker workflows, and Vercel deploys.

Learnings

  • AI search is useful only when the product still feels faster and clearer than a normal listing site.
  • Founder projects need boring operational systems early: tests, auth, billing, and deployment proof.
  • Map UX works best when semantic intent and location constraints cooperate instead of competing.