
Global Topics Management

The Global Topics page is the central command center for managing the master taxonomy catalog. Every topic classified in AudienceGPT exists first as a global topic in the topics table, and organizations adopt these into their own libraries as org topics (stored in org_topics with a foreign key back to the global record). This two-tier architecture means that improvements to global topics -- reclassification, name cleanup, taxonomy reassignment -- automatically benefit every organization that has adopted them.

This guide covers browsing, searching, editing, reclassifying, archiving, and auditing global topics from the super admin panel.

Global vs Org Topics

Understanding the relationship between global and org topics is fundamental to admin operations.

| Aspect | Global Topics (topics) | Org Topics (org_topics) |
|---|---|---|
| Scope | System-wide, shared across all orgs | Scoped to a single organization |
| Ownership | Managed by super admins | Adopted by orgs from the global catalog |
| Classification data | Full 7-layer classification, embeddings, platform outputs | References global_topic_id for classification |
| Mutations | Reclassify, merge, archive, tree mapping | Link/unlink, activation, performance tracking |
| Deduplication | pgvector embedding similarity (95% block) | Checked at adoption time via global_topic_id |
| Engine version | Stamped with ENGINE_VERSION at classification | Inherits from global topic |

DB-to-TypeScript Field Mapping

When reading code or API responses, remember the field name swap at the storage boundary:

  • DB column taxonomy_type (41 specific types like "Auto", "Business Technology") maps to TypeScript field parentCategory
  • DB column parent_category (13 broad groups like "Technology & Telecom") maps to TypeScript field taxonomyType

This swap is handled automatically by rowToTopicRecord() in the storage layer, but raw SQL routes (including admin routes) perform the swap manually in their response mappers.
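
A minimal sketch of this mapping, using the field names from the bullets above (the real rowToTopicRecord() handles many more columns than shown here):

```typescript
// DB row fields as documented: taxonomy_type holds the 41 specific types,
// parent_category holds the 13 broad groups.
interface TopicRow {
  taxonomy_type: string;   // e.g. "Auto", "Business Technology"
  parent_category: string; // e.g. "Technology & Telecom"
}

// TypeScript-side record: note the deliberate name swap.
interface TopicRecord {
  parentCategory: string; // <- DB taxonomy_type
  taxonomyType: string;   // <- DB parent_category
}

function mapTaxonomyFields(row: TopicRow): TopicRecord {
  return {
    parentCategory: row.taxonomy_type,
    taxonomyType: row.parent_category,
  };
}
```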

Topic Browser

Navigate to Admin > Topics to access the global topics browser.

[Screenshot: Global Topics Browser]

Searching and Filtering

The browser supports several filter dimensions:

| Filter | Query Parameter | Description |
|---|---|---|
| Search | search | Case-insensitive substring match on topic_name |
| Parent Category | parentCategory | Filter by one of the 41 parent categories (DB: taxonomy_type) |
| Segment Type | segmentType | Filter by B2B, B2C, B2B2C, B2E, or B2G |
| Archived | showArchived | Toggle to view archived topics instead of active ones |
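
A client-side helper might assemble these filters like so. The parameter names come from the table above; the list endpoint path (/api/admin/topics) is an assumption:

```typescript
// Builds the browser query string from the documented filter parameters.
function buildTopicsQuery(filters: {
  search?: string;
  parentCategory?: string;
  segmentType?: string;
  showArchived?: boolean;
}): string {
  const params = new URLSearchParams();
  if (filters.search) params.set("search", filters.search);
  if (filters.parentCategory) params.set("parentCategory", filters.parentCategory);
  if (filters.segmentType) params.set("segmentType", filters.segmentType);
  if (filters.showArchived) params.set("showArchived", "true");
  return `/api/admin/topics?${params.toString()}`;
}
```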

Sorting

Click any column header to sort. Supported sort columns:

  • topic_name -- Alphabetical by topic name
  • parent_category -- By the 41-type parent category
  • taxonomy_type -- By the 13-group taxonomy type
  • segment_type -- By segment classification
  • created_at -- By creation date (default, descending)
  • performance_score -- By performance metric
  • audience_type -- By audience type classification

Pagination

Results are paginated with a configurable page size (1--100, default 25). The API returns total, page, and pageSize in every response for client-side pagination controls.
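
Client pagination controls can derive the page count directly from those response fields. A small sketch:

```typescript
// Shape of the documented pagination fields returned with every response.
interface PagedResponse {
  total: number;
  page: number;
  pageSize: number; // 1-100, default 25
}

// Total number of pages; never less than 1, even for an empty result set.
function totalPages(res: PagedResponse): number {
  return Math.max(1, Math.ceil(res.total / res.pageSize));
}
```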

Viewing Topic Details

Click any topic row to open the detail panel. The detail view displays:

  • Classification layers: All 7 classification layers (intent, intensity, awareness, segment, sensitivity, buyer journey, composite)
  • Platform outputs: Generated names and descriptions for Trade Desk, LiveRamp, Internal, and any custom templates
  • Taxonomy path: Full tree path if the topic is mapped to a taxonomy node
  • Engine version: Current version stamp and whether it is outdated
  • Keywords: Auto-generated classification keywords
  • IAB code: Associated IAB content taxonomy code
  • Audience type: Derived audience type (e.g., "In-Market Buyers", "Active Enthusiasts")

Editing a Single Topic

Super admins can edit individual topic fields directly from the detail panel.

Inline Field Editing

  1. Open a topic's detail panel
  2. Click the edit icon next to the field you want to change
  3. Modify the value and confirm
  4. The system regenerates dependent fields (platform names, descriptions, taxonomy hash)

The PATCH /api/admin/topics/[topicId] endpoint handles single-topic updates. It:

  • Validates the incoming field values
  • Regenerates platform outputs using the template engine when classification-related fields change
  • Updates the updated_at timestamp
  • Records the change in the global topic history audit trail
Warning: Editing a global topic's parent_category or segment_type triggers a cascade regeneration of all platform output names and descriptions. This is by design -- platform names are derived from classification data via the template engine.
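
A hypothetical admin-client call against the documented PATCH route. The request body shape here is an assumption, not the actual API contract:

```typescript
// Builds the URL and fetch init for a single-topic update.
function buildTopicPatch(topicId: string, fields: Record<string, unknown>) {
  return {
    url: `/api/admin/topics/${topicId}`,
    init: {
      method: "PATCH",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(fields),
    },
  };
}

// Usage in an admin client:
// const { url, init } = buildTopicPatch("abc123", { segment_type: "B2B" });
// await fetch(url, init);
```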

Bulk Add to Organizations

Super admins can seed topics into the global catalog in bulk, or push global topics to specific organizations.

Admin Bulk Seeding

The bulk seeding flow creates new global topics from a list of topic names:

  1. Navigate to Admin > Topics > Bulk Add
  2. Provide a list of topic names (paste or CSV upload)
  3. The system creates an import batch with org_id = 'system'
  4. Topics are classified in 500-row chunks via POST /api/admin/topics/bulk/[batchId]/chunk
  5. Monitor progress through the batch status endpoint

API Flow:

POST /api/admin/topics/bulk          -- Create batch, get batchId
POST /api/admin/topics/bulk/:batchId/chunk -- Process each chunk
GET /api/admin/topics/bulk/:batchId/status -- Poll progress
POST /api/admin/topics/bulk/:batchId/review-topics -- Post-import review
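
The flow above can be sketched from the client side. The 500-row chunk size is documented; the request body shapes (topicNames, batchId) are assumptions for illustration:

```typescript
// Splits the input into the documented 500-row chunks.
function chunkRows<T>(rows: T[], size = 500): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// Creates the batch, then submits each chunk in order.
async function seedTopics(names: string[]): Promise<void> {
  const { batchId } = await fetch("/api/admin/topics/bulk", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ topicNames: names }), // assumed body shape
  }).then((r) => r.json());

  for (const chunk of chunkRows(names)) {
    await fetch(`/api/admin/topics/bulk/${batchId}/chunk`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ topicNames: chunk }), // assumed body shape
    });
  }
  // Then poll GET /api/admin/topics/bulk/:batchId/status until complete.
}
```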

Pushing to Org Libraries

To make global topics available in an organization's library, users can:

  • Use the chatbot catalog browse feature (natural language: "add all automotive topics")
  • Use the catalog API directly: POST /api/topics/catalog with selected topic IDs

Both methods call neonLinkTopicsBatch() which creates org_topics records referencing the global topics.id.

Bulk Reclassify

Reclassification re-runs the classification pipeline on existing topics, updating their taxonomy, keywords, platform names, and engine version stamp.

[Screenshot: Reclassify Panel]

Reclassify Modes

| Mode | Engine | Speed | Cost | Web Search |
|---|---|---|---|---|
| Rule-based (Local) | Deterministic regex engine | Fast (~50/sec) | Free | No |
| AI-powered (LLM) | Claude Sonnet 4.6 via Batch API | Slower (batch queued) | $1.50--$7.50/M tokens | Configurable |

Starting a Full Reclassify Job

  1. Navigate to Admin > Topics > Reclassify tab
  2. Select the mode (Rule-based or AI-powered)
  3. Configure options:
    • Force: Reclassify all topics, not just outdated ones
    • Limit: Cap the number of topics to process
    • Parent Category Filter: Only reclassify topics in a specific parent category
    • Web Search: Enable/disable web search for LLM mode (disabling saves ~50% cost)
  4. Review the cost estimate (shown for LLM mode)
  5. Click Start Reclassify

The system creates a background job of type reclassify and processes it in chunks. See Background Jobs for details on monitoring and management.
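
For LLM mode, a back-of-the-envelope cost estimate follows from the documented $1.50--$7.50 per million token range. The tokens-per-topic figure below is purely an illustrative assumption:

```typescript
// Rough cost estimate for an LLM reclassify run.
// ratePerMTok: dollars per million tokens ($1.50-$7.50 per the mode table).
function estimateCostUSD(
  topicCount: number,
  tokensPerTopic: number, // assumption; varies with prompt and web search
  ratePerMTok: number,
): number {
  return (topicCount * tokensPerTopic * ratePerMTok) / 1_000_000;
}

// e.g. 10,000 topics at ~800 tokens each, $1.50/M tokens
```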

Single-Column Reclassify

For targeted corrections, you can reclassify just one field across all topics:

| Target Column | Description |
|---|---|
| segment_type | B2B/B2C/B2B2C/B2E/B2G classification |
| taxonomy_type | 13-group taxonomy assignment |
| parent_category | 41-type parent category assignment |
| subcategory | Subcategory label |
| audience_type | Audience type classification |

Single-column reclassification uses the same Anthropic Batch API as full reclassify but with a focused prompt that asks only for the targeted field. This is significantly cheaper than a full reclassify.

To start a single-column reclassify:

  1. Select "Column Mode" in the reclassify panel
  2. Choose the target column from the dropdown
  3. Configure force/limit/filter options
  4. Start the job
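
The valid target columns can be captured as a union type so admin tooling rejects unsupported values. This is an illustrative sketch, not the actual implementation:

```typescript
// The five single-column reclassify targets from the table above.
type ReclassifyColumn =
  | "segment_type"
  | "taxonomy_type"
  | "parent_category"
  | "subcategory"
  | "audience_type";

const RECLASSIFY_COLUMNS: ReclassifyColumn[] = [
  "segment_type",
  "taxonomy_type",
  "parent_category",
  "subcategory",
  "audience_type",
];

// Type guard for validating user-supplied column names.
function isReclassifyColumn(value: string): value is ReclassifyColumn {
  return (RECLASSIFY_COLUMNS as string[]).includes(value);
}
```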

LLM Reclassify Processing Phases

AI-powered reclassification uses a three-phase approach via the Anthropic Message Batches API:

  1. Phase 1 -- Submit: Collects all eligible topics, splits into sub-batches of 100, and submits each to the Anthropic Batch API (50% cost discount over real-time)
  2. Phase 2a -- Poll & Stage: Each CRON tick polls batch status. Completed batches are streamed into a batch_results_staging table. Time-bounded to 45 seconds per tick.
  3. Phase 2b -- Post-process: Staged results are processed in chunks of 500. Each result is validated, post-processed, and written to the topics table with history recording.
Tip: For large reclassify jobs (10,000+ topics), the LLM batch approach is both cheaper and more reliable than real-time API calls. The sub-batch architecture means partial results are preserved even if a CRON tick times out.
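
The time-bounded polling in Phase 2a can be sketched as follows. Here checkBatch and stage are hypothetical stand-ins for the Anthropic batch-status call and the staging-table write:

```typescript
// One CRON tick: process as many sub-batches as fit in the 45-second
// budget, then return whatever is still pending for the next tick.
async function pollTick(
  subBatchIds: string[],
  checkBatch: (id: string) => Promise<"ended" | "in_progress">,
  stage: (id: string) => Promise<void>, // write into batch_results_staging
  budgetMs = 45_000,
): Promise<string[]> {
  const deadline = Date.now() + budgetMs;
  const remaining: string[] = [];
  for (const id of subBatchIds) {
    if (Date.now() >= deadline) {
      remaining.push(id); // out of budget; pick up next tick
      continue;
    }
    if ((await checkBatch(id)) === "ended") {
      await stage(id); // completed batch: stream results to staging
    } else {
      remaining.push(id); // still running at Anthropic
    }
  }
  return remaining;
}
```

Because partial progress is returned rather than discarded, completed sub-batches survive a tick that runs out of time.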

Archiving Topics

Archiving soft-removes topics from the active catalog without deleting them.

Archive

  1. Select topics in the browser using checkboxes
  2. Click Archive Selected
  3. Confirm the action

API: POST /api/admin/topics/archive with { topicIds: string[] } (max 500 per request)

Archived topics:

  • Have archived_at set to the current timestamp
  • Are excluded from all standard queries (WHERE archived_at IS NULL)
  • Can still be viewed by toggling "Show Archived" in the browser
  • Retain their classification data and history

Restore

  1. Toggle "Show Archived" in the browser
  2. Select archived topics
  3. Click Restore Selected

API: DELETE /api/admin/topics/archive with { topicIds: string[] }

Both archive and restore operations are recorded in the global topic history audit trail.
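
Both operations can share one request builder. The endpoint, HTTP methods, body shape, and 500-ID cap come from the text above; the validation is illustrative:

```typescript
// Archive (POST) or restore (DELETE) a batch of topics, enforcing the
// documented 500-ID limit per request.
function buildArchiveRequest(topicIds: string[], restore = false) {
  if (topicIds.length === 0 || topicIds.length > 500) {
    throw new Error("topicIds must contain 1-500 entries");
  }
  return {
    url: "/api/admin/topics/archive",
    init: {
      method: restore ? "DELETE" : "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ topicIds }),
    },
  };
}
```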

Topic Statistics

The stats endpoint provides aggregate metrics for the global catalog:

API: GET /api/admin/topics/stats

Returns:

  • total -- Total active (non-merged, non-archived) topics
  • byParentCategory -- Count per parent category (41 types)
  • bySegmentType -- Count per segment type (B2B, B2C, etc.)
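
An assumed TypeScript shape for this response, with a small helper that turns the counts into percentages for a dashboard view:

```typescript
// Assumed response shape for GET /api/admin/topics/stats.
interface TopicStats {
  total: number;
  byParentCategory: Record<string, number>;
  bySegmentType: Record<string, number>;
}

// Fraction of active topics in a given segment (0 when the catalog is empty).
function segmentShare(stats: TopicStats, segment: string): number {
  if (stats.total === 0) return 0;
  return (stats.bySegmentType[segment] ?? 0) / stats.total;
}
```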

Topic Metrics

The metrics endpoint provides deeper analysis:

API: GET /api/admin/topics/metrics

Returns:

  • Current snapshot: Topic count, parent category coverage, segment distribution, audience types, taxonomy hash coverage, platform output coverage, keyword stats, description coverage
  • Per-version revisions: The same metrics broken down by engine version, showing how the catalog has evolved across classification engine updates

History and Audit Trail

Every change to a global topic is recorded in the topic_history table via the fire-and-forget recordGlobalTopicChange() / recordGlobalTopicChangesBatch() functions.

Viewing History

API: GET /api/admin/topics/[topicId]/history

Returns up to 200 entries ordered by changed_at DESC. Each entry contains:

| Field | Description |
|---|---|
| changeType | reclassified, merged, rolled_back, archived, restored |
| engineVersionBefore | Engine version before the change |
| engineVersionAfter | Engine version after the change |
| previousValues | Snapshot of field values before the change |
| newValues | Snapshot of field values after the change |
| changedBy | User ID who made the change |
| source | Origin: admin_reclassify_local, admin_reclassify_llm, admin_merge, admin_rollback, admin_ui |
| batchId | Job ID for batch operations (enables rollback) |
Tip: The batchId field links all history entries from a single job together. This is what makes job-level rollback possible -- the rollback handler queries all topic_history entries with the matching batchId and restores their previousValues.
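
A sketch of that rollback selection; the types are assumptions based on the documented history fields:

```typescript
// Minimal slice of a history entry relevant to rollback.
interface HistoryEntry {
  topicId: string;
  batchId: string | null;
  previousValues: Record<string, unknown>;
}

// For each entry from the target job, collect the field values to restore.
function rollbackUpdates(
  entries: HistoryEntry[],
  batchId: string,
): Map<string, Record<string, unknown>> {
  const updates = new Map<string, Record<string, unknown>>();
  for (const e of entries) {
    if (e.batchId === batchId) updates.set(e.topicId, e.previousValues);
  }
  return updates; // topicId -> previousValues to write back
}
```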

Outstanding Issues Dashboard

The outstanding issues endpoint surfaces problems requiring admin attention:

API: GET /api/admin/topics/outstanding

Returns:

  • Review Queue: Topics with review_status = 'needs_review' (up to 20 with full details, plus total count)
  • Classification Errors: Recent reclassify jobs with errors, including topic-level error details

This endpoint powers the proactive issue surfacing in the classify chatbot when a super admin opens the page.

Next Steps