Inside ELCA’s Intelligent Search


Architecting Responsible AI for Context and Trust

When most organizations talk about “AI search,” they mean a smarter query box. When the Evangelical Lutheran Church in America (ELCA) approached nvisia, they needed something deeper — an intelligent discovery engine that could connect thousands of congregations and seekers with faith resources accurately, contextually, and responsibly.


This wasn’t just a data challenge. It was a matter of precision and trust.

The Problem Beneath the Query

ELCA’s digital ecosystem spanned multiple websites and content systems — from Webflow and WordPress to proprietary directories — each with its own schema, tone, and governance. Traditional keyword search couldn’t interpret intent or context across these silos.

The outcome? Duplicate results, irrelevant matches, and a frustrating user experience that obscured rather than illuminated valuable content. For a mission-driven organization, that disconnect carried real consequences: lost engagement, reduced giving, and a weakened digital bridge to faith.

The solution required an AI architecture that could unify data, interpret natural language, and uphold theological integrity — all without sacrificing speed, scale, or security.

Architecting the Solution

To achieve this, we designed a multi-layer intelligent search platform on Microsoft Azure AI Foundry — combining semantic retrieval, generative AI, and continuous evaluation to deliver precision with accountability.

Data Integration & Storage

We began by consolidating six disparate content sources into a unified knowledge base.

  • APIs & Ingestion Layer: Data pipelines were established from the Xano (backed by Snowflake as its underlying data source), Webflow, and WordPress APIs, with an ingestion app performing daily ETL (extract, transform, load) operations.
  • Custom Data Storage: Content was normalized and stored in a semantic retrieval layer optimized for Azure AI Search indexing.

This ensured complete coverage and a consistent data model across thousands of resources — from sermons to administrative documents.
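To make the pattern concrete, here is a minimal sketch of what one such daily ETL job could look like. It assumes a hypothetical unified index named faith-resources, illustrative field mappings, and the public WordPress REST API as the example source; the Xano and Webflow pulls would follow the same extract-transform-load shape.

```python
# Minimal daily ETL sketch: pull content from one source (WordPress REST API),
# normalize it to a shared schema, and upsert it into an Azure AI Search index.
# Endpoint URLs, index name, and field names are illustrative assumptions.
import os
import requests
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

SEARCH_ENDPOINT = os.environ["SEARCH_ENDPOINT"]  # e.g. https://<service>.search.windows.net
SEARCH_KEY = os.environ["SEARCH_ADMIN_KEY"]
INDEX_NAME = "faith-resources"                   # hypothetical unified index

search_client = SearchClient(SEARCH_ENDPOINT, INDEX_NAME, AzureKeyCredential(SEARCH_KEY))

def extract_wordpress(base_url: str) -> list[dict]:
    """Pull published posts via the public WordPress REST API."""
    resp = requests.get(f"{base_url}/wp-json/wp/v2/posts",
                        params={"per_page": 100}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(posts: list[dict]) -> list[dict]:
    """Map source-specific fields onto the unified document schema."""
    return [
        {
            "id": f"wordpress-{p['id']}",
            "title": p["title"]["rendered"],
            "content": p["content"]["rendered"],
            "source": "wordpress",
            "url": p["link"],
        }
        for p in posts
    ]

def load(documents: list[dict]) -> None:
    """Upsert normalized documents into the search index."""
    if documents:
        search_client.merge_or_upload_documents(documents=documents)

if __name__ == "__main__":
    load(transform(extract_wordpress("https://example-elca-site.org")))
```

Keeping extract, transform, and load separate per source keeps each connector small, so a new content system can be added without touching the unified index schema.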

Semantic Search Workflow

At runtime, user prompts travel through a carefully orchestrated pipeline:

  • Retriever (Azure AI Search): Executes high-speed semantic search across indexed content.
  • Orchestrator (Azure ML Prompt Flow): Interprets intent, retrieves relevant documents, and blends deterministic retrieval with generative reasoning.
  • Large Language Model (Azure OpenAI): Produces synthesized, context-aware responses.
  • Evaluator Model: Independently scores output for groundedness and theological alignment before returning it to the user interface.

This layered approach enables intent recognition, contextual relevance, and traceable reasoning — key requirements for responsible generative AI.
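In production this orchestration runs in Azure ML Prompt Flow, but the retrieve-then-generate pattern underneath it fits in a short sketch. The example below assumes the same hypothetical index and field names as the ingestion sketch, plus a placeholder Azure OpenAI deployment name.

```python
# Minimal retrieve-then-generate sketch of the runtime pipeline.
# The real system orchestrates this in Azure ML Prompt Flow; names are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(os.environ["SEARCH_ENDPOINT"], "faith-resources",
                             AzureKeyCredential(os.environ["SEARCH_QUERY_KEY"]))
llm = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-06-01",
)

def answer(question: str) -> str:
    # 1. Retriever: search the unified index for candidate passages.
    hits = search_client.search(search_text=question, top=5)
    context = "\n\n".join(f"[{h['title']}] {h['content'][:1000]}" for h in hits)

    # 2. LLM: generate a response grounded only in the retrieved passages.
    response = llm.chat.completions.create(
        model="gpt-4o",  # Azure deployment name (assumption)
        messages=[
            {"role": "system",
             "content": "Answer using only the provided ELCA resources. "
                        "If the answer is not in the sources, say so."},
            {"role": "user", "content": f"Sources:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

print(answer("How do I find a congregation near me?"))
```

Keeping retrieval deterministic and confining the model to the retrieved sources is what makes each response traceable back to specific documents.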

Trust & Safety

AI doesn’t inspire confidence unless its answers can be trusted. To that end, we implemented:

  • Azure Content Filtering to block harmful categories (violence, hate, sexual content) and prompt attacks.
  • Faith-Informed Prompt Engineering to maintain tone, accuracy, and doctrinal soundness.
  • Continuous Evaluation Loops logging every interaction to refine model behavior over time.

The combination creates a feedback system that learns responsibly — improving quality without introducing bias or drift.
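A simplified, LLM-as-judge version of that evaluation step is sketched below: a second model scores the draft answer for groundedness against the retrieved sources, every interaction is logged for later review, and low-scoring answers are replaced with a safe fallback. The rubric, threshold, evaluator deployment name, and logging destination are illustrative assumptions, not the production configuration.

```python
# Hedged sketch of an evaluation loop: score a draft answer for groundedness
# against the retrieved sources, log the interaction, and gate low-scoring output.
import json
import logging
import os
from openai import AzureOpenAI

logger = logging.getLogger("search.evaluations")
evaluator = AzureOpenAI(azure_endpoint=os.environ["AOAI_ENDPOINT"],
                        api_key=os.environ["AOAI_KEY"],
                        api_version="2024-06-01")

GROUNDEDNESS_THRESHOLD = 4  # 1-5 scale; illustrative value

def evaluate_and_gate(question: str, sources: str, draft_answer: str) -> str:
    rubric = ("Rate from 1 to 5 how fully the ANSWER is supported by the SOURCES. "
              'Respond with JSON: {"score": <int>, "reason": "..."}')
    result = evaluator.chat.completions.create(
        model="gpt-4o-mini",  # separate evaluator deployment (assumption)
        messages=[
            {"role": "system", "content": rubric},
            {"role": "user",
             "content": f"SOURCES:\n{sources}\n\nQUESTION: {question}\n\nANSWER: {draft_answer}"},
        ],
        temperature=0,
        response_format={"type": "json_object"},
    )
    verdict = json.loads(result.choices[0].message.content)

    # Continuous evaluation loop: every interaction is logged for offline review.
    logger.info("evaluation", extra={"search_question": question,
                                     "groundedness": verdict["score"],
                                     "reason": verdict["reason"]})

    if verdict["score"] < GROUNDEDNESS_THRESHOLD:
        return ("I wasn't able to find a well-supported answer in ELCA's resources. "
                "Please try rephrasing your question or browse the resource library.")
    return draft_answer
```

Gating on the evaluator's score means a weak answer degrades to a safe fallback instead of reaching the user.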

User Experience & Performance

A React-based front end provides conversational and traditional search modes, tuned for accessibility and inclusivity.

  • Performance Optimization: Scalable VM sizing and reranking algorithms sustain response times at scale.
  • Accessibility: Inclusive design principles ensure users of varying digital fluency can interact seamlessly with the platform.
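Reranking deserves a closer look. Azure AI Search can layer a semantic reranker over the initial results, and the sketch below shows the kind of query configuration involved; the index name, semantic configuration name, and result fields are assumptions.

```python
# Sketch of a semantically reranked query against Azure AI Search.
# The service, index, and semantic configuration names are illustrative assumptions.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import QueryType, QueryCaptionType

client = SearchClient(os.environ["SEARCH_ENDPOINT"], "faith-resources",
                      AzureKeyCredential(os.environ["SEARCH_QUERY_KEY"]))

results = client.search(
    search_text="youth ministry grants",
    query_type=QueryType.SEMANTIC,              # enable the semantic reranking layer
    semantic_configuration_name="default",      # defined on the index (assumption)
    query_caption=QueryCaptionType.EXTRACTIVE,  # return highlighted snippets
    top=5,
)

for doc in results:
    # Reranker score key as surfaced by the Python SDK (assumption); absent for plain queries.
    print(doc["title"], doc.get("@search.reranker_score"))
```

Because reranking happens inside the search service, it improves relevance without adding another model call to the request path.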

Results in Practice

Within months, ELCA’s new intelligent search platform ingested over 40,000 documents and pages across six digital properties, making faith resources searchable through natural language and semantic relationships.

Four core user journeys were supported out of the gate — Find a Congregation, Find a Minister, Give to ELCA, and Grow in Knowledge and Faith — each powered by contextual AI understanding rather than static keyword logic.

The system now delivers 24/7 availability, rapid response times, and verifiable accuracy through continuous evaluation. The architecture is extensible for future use cases, including multilingual expansion (English/Spanish) and cross-ministry integration.

Why This Matters

This project demonstrates what happens when AI is treated as an architectural discipline, not a magic trick. The same principles that govern strong enterprise systems — clarity, scalability, governance, and empathy — are what make intelligent search sustainable.

For ELCA, this architecture did more than improve user experience; it reinforced the church’s mission by connecting technology with theology. (Read the full case study here.)

For nvisia, it reaffirms that responsible AI begins not with models, but with design intention.

Takeaway

Whether your organization manages public knowledge, proprietary data, or customer self-service, the pattern remains: Semantic discovery succeeds when it’s architected for context, safety, and scale.

If your users can’t find what they need — or don’t trust what they find — it’s time to rebuild the foundation.

👉 Let’s architect your intelligent search strategy.
