Search across big data using LLMs

Problem

In 2020, the company’s big data SaaS platform enabled large-scale document processing and answer extraction from financial and legal content.

However, extracted answers were siloed within individual projects, making it difficult for users to find consistent information about a customer or topic across different workstreams.

These silos slowed daily workflows and forced frequent context switching.

Solution

Led the design of a global search experience that allowed users to:

  • Search extracted answers across all projects

  • Filter results by:

    • Document metadata (e.g. customer name, document type)

    • Confidence scores

    • Project

  • View answers in context, with a direct link to the source document

Collaborated with engineering and product teams to align on scope and MVP priorities.
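The cross-project search and filtering described above could be sketched roughly as follows. This is a minimal illustration only; every class, field, and function name here is hypothetical and does not reflect the platform's actual API.

```python
from dataclasses import dataclass

# Hypothetical model of an extracted answer; fields are illustrative.
@dataclass
class ExtractedAnswer:
    project: str
    customer: str
    doc_type: str
    confidence: float
    text: str
    source_url: str  # deep link back to the answer's source document

def search_answers(answers, query, min_confidence=0.0, project=None, customer=None):
    """Match a query against answers from all projects, with optional filters."""
    results = []
    for a in answers:
        if query.lower() not in a.text.lower():
            continue  # no text match
        if a.confidence < min_confidence:
            continue  # below the confidence-score filter
        if project is not None and a.project != project:
            continue  # project filter
        if customer is not None and a.customer != customer:
            continue  # document-metadata filter
        results.append(a)
    # Surface highest-confidence matches first
    return sorted(results, key=lambda a: a.confidence, reverse=True)
```

For example, searching "notice" with `min_confidence=0.5` would return only the high-confidence match, ranked above any lower-confidence ones.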

Outcome

Aligned the team around a new feature designed to:

  • Improve user efficiency and retention, especially for enterprise users managing large-scale portfolios

  • Support product stickiness by reducing friction in daily workflows

[Screenshots: Filtered results · Annotated docs · Smart filters]

Email us

hello@swirlypeak.com
