
Granola

Horizontal AI · A · 5 risks

Granola is positioning itself as a Series C horizontal AI infrastructure play, building foundational capabilities around RAG (retrieval-augmented generation).

www.granola.ai
Series C · GenAI: core · London, United Kingdom
$125.0M raised
20KB analyzed · 10 quotes · Updated Mar 31, 2026
Why This Matters Now

As agentic architectures emerge as the dominant build pattern, Granola is positioned to benefit from enterprise demand for autonomous workflow solutions. The timing aligns with broader market readiness for AI systems that can execute multi-step tasks without human intervention.

Granola is an AI notepad that transcribes, summarizes, and organizes conversations into structured notes.

Core Advantage

A combined product and technical stack: a notepad-first UX that fuses user-written notes with high-quality transcripts and LLM reasoning, plus MCP to expose that structured meeting context to other AI models and apps — all delivered with enterprise-grade privacy and offline capabilities.

Build Signals

RAG (Retrieval-Augmented Generation)

4 quotes · high confidence

Granola stores transcripts and AI summaries, indexes meetings by people/companies, and uses that historical context when answering cross-meeting questions — a textbook retrieval + generation pipeline (embeddings/vector or document store + LLMs).
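The retrieval + generation pipeline described above can be sketched end to end. Everything here is illustrative, not Granola's implementation: the bag-of-words "embedding", the `MeetingStore` class, and the stubbed generation step are all assumptions standing in for a real vector store and LLM call.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a production system would use a vector model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MeetingStore:
    def __init__(self):
        self.meetings = []  # (metadata, text, embedding) triples

    def add(self, people, company, text):
        meta = {"people": people, "company": company}
        self.meetings.append((meta, text, embed(text)))

    def retrieve(self, query: str, k: int = 2):
        q = embed(query)
        ranked = sorted(self.meetings, key=lambda m: cosine(q, m[2]), reverse=True)
        return ranked[:k]

def answer(store: MeetingStore, question: str) -> str:
    # Retrieval step: pull the most relevant past meetings as context.
    # A generation step would then hand (context, question) to an LLM; stubbed here.
    context = "\n".join(text for _, text, _ in store.retrieve(question))
    return f"CONTEXT:\n{context}\nQUESTION: {question}"

store = MeetingStore()
store.add(["Ana"], "Acme", "Acme renewal discussed pricing tiers and a Q3 pilot")
store.add(["Bo"], "Zen Corp", "Zen Corp intro call about onboarding timelines")
print(answer(store, "What did we agree with Acme about pricing?"))
```

The key structural point is the two-phase shape: rank stored meetings against the question, then generate an answer conditioned on the retrieved context rather than the whole corpus.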

What This Enables

Accelerates enterprise AI adoption by providing audit trails and source attribution.

Time Horizon: 0-12 months
Primary Risk: Pattern becoming table stakes; differentiation shifting to retrieval quality.

Micro-model Meshes (multi-model orchestration)

3 quotes · high confidence

They explicitly route meeting context to multiple external LLMs/providers via a Model Context Protocol, suggesting an orchestration layer that can route tasks/context to different models (ensembles, provider selection, or task-specific model usage).
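An orchestration layer of this kind can be sketched as a small router. The model names, task types, and selection rule below are hypothetical; in production each `call` stub would be an API client and the selection criteria would include cost, latency, and privacy constraints, not just context budget.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelRoute:
    name: str
    handles: set                 # task types this model is used for
    max_context: int             # rough context budget in tokens
    call: Callable[[str], str]   # stub backend; a real client would go here

# Hypothetical routes (ensembles / task-specific model usage).
routes = [
    ModelRoute("fast-summarizer", {"summarize"}, 8_000, lambda p: f"[fast] {p[:40]}"),
    ModelRoute("long-context", {"summarize", "qa"}, 200_000, lambda p: f"[long] {p[:40]}"),
    ModelRoute("reasoning", {"qa", "extract"}, 32_000, lambda p: f"[reason] {p[:40]}"),
]

def route(task: str, prompt: str) -> str:
    tokens = len(prompt.split())  # crude token estimate
    candidates = [r for r in routes if task in r.handles and tokens <= r.max_context]
    if not candidates:
        raise ValueError(f"no model can serve task={task!r} at {tokens} tokens")
    # Prefer the cheapest adequate model: smallest context window that still fits.
    best = min(candidates, key=lambda r: r.max_context)
    return best.call(prompt)

print(route("summarize", "weekly sync notes ..."))  # handled by fast-summarizer
print(route("qa", "who owns the Q3 pilot?"))        # handled by reasoning
```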

What This Enables

Emerging pattern with potential to unlock new application categories.

Time Horizon: 12-24 months
Primary Risk: Limited data on long-term viability in this context.

Continuous-learning Flywheels

3 quotes · medium confidence

User transcripts, edits and usage appear to feed internal model improvement (anonymized training), forming a product-data-model feedback loop that improves summaries, prompts/recipes, and search over time.
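The feedback loop can be made concrete: the user's edited note is the supervision signal for what the model summary should have said. This sketch is an assumption about how such a loop might be wired, including the pseudonymization and opt-out gating; it is not Granola's actual pipeline.

```python
import hashlib

training_pairs = []  # (model_summary, user_final) pairs collected as training signal

def anonymize(text: str, names: list[str]) -> str:
    # Replace known personal names with stable pseudonyms (illustrative only).
    for n in names:
        alias = "person_" + hashlib.sha256(n.encode()).hexdigest()[:6]
        text = text.replace(n, alias)
    return text

def record_edit(model_summary: str, user_final: str, names: list[str], opted_out: bool):
    # Opt-out controls / enterprise defaults gate collection entirely.
    if opted_out:
        return
    training_pairs.append((anonymize(model_summary, names),
                           anonymize(user_final, names)))

record_edit("Ana will send pricing.", "Ana will send pricing tiers by Friday.",
            ["Ana"], opted_out=False)
record_edit("Bo owns rollout.", "Bo owns the phased rollout.",
            ["Bo"], opted_out=True)
print(len(training_pairs))  # 1: opted-out edits are never collected
```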

What This Enables

Winner-take-most dynamics in categories where well-executed. Defensibility against well-funded competitors.

Time Horizon: 24+ months
Primary Risk: Requires critical mass of users to generate meaningful signal.

Knowledge Graphs / Entity-Centric Indexing

3 quotes · medium confidence

Granola builds entity-aware indexes (people, companies, meetings) enabling entity-centric queries and cross-note linking — functionally similar to a permission-aware knowledge graph even if not explicitly called a graph DB.
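Functionally, an entity-centric index is an inverted index keyed by person or company rather than by keyword. A minimal in-memory sketch (illustrative data model, not Granola's schema):

```python
from collections import defaultdict

notes = {}                        # meeting_id -> note text
entity_index = defaultdict(set)   # entity -> set of meeting_ids

def add_meeting(meeting_id: str, text: str, people: list[str], companies: list[str]):
    notes[meeting_id] = text
    for entity in people + companies:
        entity_index[entity].add(meeting_id)

def meetings_about(*entities: str) -> set:
    # Cross-note query: meetings where all of the given entities co-occur.
    sets = [entity_index[e] for e in entities]
    return set.intersection(*sets) if sets else set()

add_meeting("m1", "Kickoff with Acme", ["Ana"], ["Acme"])
add_meeting("m2", "Acme pricing review", ["Ana", "Bo"], ["Acme"])
add_meeting("m3", "Internal retro", ["Bo"], [])
print(sorted(meetings_about("Ana", "Acme")))  # ['m1', 'm2']
```

A permission-aware version would intersect one more set per query: the meetings the requesting user is allowed to see.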

What This Enables

Emerging pattern with potential to unlock new application categories.

Time Horizon: 12-24 months
Primary Risk: Limited data on long-term viability in this context.
Technical Foundation

Granola integrates with Claude, ChatGPT, and Cursor, leveraging OpenAI and Anthropic infrastructure with Model Context Protocol in the stack. The technical approach emphasizes a hybrid on-device/cloud design.

Model Architecture
Primary Models
• OpenAI (GPT family, implied via ChatGPT references)
• Anthropic (Claude, explicit)
• ChatGPT
• Cursor (mentioned)
• Deepgram (ASR)
• Assembly (ASR)
Fine-tuning

Not specified in content. Granola states they "train on your anonymized data" for product improvement. Could include centralized fine-tuning or internal model training pipelines, but no explicit mention of LoRA, RLHF, or full fine-tuning.
Data: anonymized user meeting transcripts and notes stored in Granola (with opt-out controls and enterprise defaults)

Compound AI System

Granola exposes meeting context to external models/tools using an explicit Model Context Protocol (MCP). They also combine ASR providers with LLM providers in the end-to-end pipeline (transcribe -> summarize -> store -> export). There is no public evidence of internal multi-model chaining (e.g., model A calls model B with structured handoffs) beyond the context-export/third-party integration layer.
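The transcribe -> summarize -> store -> export shape can be sketched as four stages with explicit handoffs. The stage bodies below are stubs standing in for ASR and LLM API calls, and the export payload shape is an assumption, not Granola's actual MCP schema.

```python
# End-to-end pipeline sketch. Real stages would call an ASR provider
# (e.g. Deepgram/Assembly) and an LLM provider (e.g. OpenAI/Anthropic).

def transcribe(audio: bytes) -> str:
    return "Ana: let's ship the pilot. Bo: agreed, by Friday."  # stubbed ASR output

def summarize(transcript: str) -> str:
    return "Decision: ship the pilot by Friday."  # stubbed LLM summary

store = {}  # in-memory stand-in for persistent storage

def persist(meeting_id: str, transcript: str, summary: str):
    store[meeting_id] = {"transcript": transcript, "summary": summary}

def export_context(meeting_id: str) -> dict:
    # Structured context handed to external models/tools; payload shape assumed.
    rec = store[meeting_id]
    return {"meeting_id": meeting_id, "summary": rec["summary"]}

t = transcribe(b"...")
persist("m1", t, summarize(t))
print(export_context("m1"))
```

Note the boundary the source describes: chaining happens between pipeline stages and at the context-export layer, not as internal model-to-model handoffs.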

Inference Optimization
• Client-side audio capture and ephemeral caching to reduce raw audio persistence
• Offline-capable local indexing and cached data for fast search
• API-level gating of only processed artifacts to avoid repeated/partial inference
• Rate-limiting to protect inference capacity and ensure stability
Team
Founder-Market Fit

Insufficient information about founders; no identifiable founder profiles or backgrounds provided in the material.

Engineering-heavy · ML expertise · Domain expertise
Hiring: engineers (Product Engineer, Security Engineer, Android Engineer, Design Engineer)
Hiring: CTO/Head of Engineering or equivalent
Hiring: GTM roles (Head of Sales, Account Executives, SDRs, Customer Success)
Hiring: Product/UX roles (Product Manager, Design Engineer, UX/UI)
Considerations
  • Founders and core leadership identities are not disclosed in the provided materials, limiting assessment of founding experience and domain fit
  • Heavy emphasis on marketing and product stories with limited public traces of founders or leadership biographies in the supplied content
Business Model
Go-to-Market

Product-led

Target: enterprise

Pricing

Freemium

Free tier · Enterprise focus
Sales Motion

Hybrid

Distribution Advantages
  • Strong integrations ecosystem (Zapier, Slack, Notion, Attio, HubSpot, etc.)
  • Multi-platform availability (Windows, macOS, iOS, Android)
  • Security and governance features (SOC 2, SSO, admin controls, org-wide policies)
  • Team collaboration features (Team Folders, Shared with me), enabling network effects
Customer Evidence

• Brex founder/CEO quote praising Granola

• Investor and influencer mentions (Betaworks, Firstminute)

Product
Stage: General availability
Differentiating Features
• Granola MCP: integration with Claude, ChatGPT, Cursor, and other AI tools via Model Context Protocol
• Team-level context across meetings and long-form conversations (Granola 2.0) as a second brain
• Granola for Windows and iOS, plus deep calendar integrations and calendar-aware suggestions
• SOC 2 compliance and enterprise-grade admin controls, including org-wide auto-deletion and SSO
• Depth of integrations (Attio, Notion, Slack, HubSpot, Affinity, Zapier) and upcoming 8k+ app connectivity via Zapier
Integrations
Slack · Notion · HubSpot · Attio · Affinity · Zapier
Primary Use Case

Capture, transcribe, and structure meeting content into AI-enhanced notes with context across meetings

Competitive Context

Granola operates in a competitive landscape that includes Otter.ai, Fireflies.ai, and Grain.

Otter.ai

Differentiation: Granola emphasizes a notepad-like UX, transcribes system audio without adding bots, provides richer AI-enhanced notes (templates, Recipes, Chat across meetings), stronger team/workspace features and explicit model-training opt-outs and enterprise controls.

Fireflies.ai

Differentiation: Granola focuses on a lightweight editor experience (Apple Notes-like), real-time in-meeting assistance, offline-first desktop/iOS apps, MCP (Model Context Protocol) to push structured meeting context into other LLMs, plus stronger privacy controls and SOC2 compliance positioning.

Grain

Differentiation: Granola targets continuous note-taking for back-to-back meetings (not just highlight clips), integrates richer AI workflows (Recipes, cross-meeting chat), stores structured summaries and templates rather than focusing primarily on video snippets and shareable clips.

Notable Findings

Local, bot-free transcription on desktop: Granola captures both the microphone and system audio without adding a meeting bot. That implies an OS-level audio-capture strategy (virtual audio device / loopback or privileged audio APIs), low-latency streaming to transcription providers, and careful handling of platform audio routing differences across macOS, Windows, and iOS. This is atypical vs. the industry norm of joining a call with a bot.

Hybrid on-device / cloud transcription model: claims of real-time desktop transcription and temporarily cached iOS audio suggest a split pipeline — likely streaming to cloud STT (Deepgram/Assembly) for accuracy while minimizing persistent audio storage. Engineering tradeoffs: ephemeral local caching, strict lifecycle controls, and coordination to ensure no audio is persisted beyond processing.
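The "ephemeral local caching, strict lifecycle controls" tradeoff reduces to a pattern like the following: audio exists on disk only for the duration of processing and is removed even if processing fails. This is a minimal sketch of that lifecycle guarantee, not Granola's implementation.

```python
import os
import tempfile

def process_with_ephemeral_cache(audio_chunk: bytes, transcribe) -> str:
    # Write audio to a temp file only for the duration of processing,
    # then delete it so no raw audio persists beyond the pipeline.
    fd, path = tempfile.mkstemp(suffix=".pcm")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(audio_chunk)
        return transcribe(path)  # e.g. stream the file to a cloud STT provider
    finally:
        os.remove(path)  # strict lifecycle: cache removed even on failure

text = process_with_ephemeral_cache(
    b"\x00\x01", lambda p: f"transcript for {os.path.basename(p)}")
print(text)
```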

Provider-agnostic polyglot pipeline for STT and LLMs: they explicitly call out multiple vendors (Deepgram, Assembly, OpenAI, Anthropic) and MCP to connect to Claude/ChatGPT/Cursor. That points to dynamic routing (selecting provider based on language, latency, cost, or privacy constraints), response normalization, and a unified intermediate representation (speaker segments, timestamps, entities) used across models.
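A unified intermediate representation of this kind is essentially one normalized segment type plus a per-vendor adapter. The vendor payload shapes below are invented for illustration; only the normalization pattern is the point.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    # Unified intermediate representation: speaker turns with millisecond timestamps.
    speaker: str
    start_ms: int
    end_ms: int
    text: str

def from_provider_a(raw: dict) -> list[Segment]:
    # Hypothetical vendor A payload: already in milliseconds.
    return [Segment(w["spk"], w["s"], w["e"], w["txt"]) for w in raw["words"]]

def from_provider_b(raw: dict) -> list[Segment]:
    # Hypothetical vendor B payload: seconds as floats, different field names.
    return [Segment(u["speaker"], round(u["start"] * 1000), round(u["end"] * 1000), u["text"])
            for u in raw["utterances"]]

a = from_provider_a({"words": [{"spk": "S1", "s": 0, "e": 900, "txt": "hello"}]})
b = from_provider_b({"utterances": [{"speaker": "S1", "start": 0.0, "end": 0.9, "text": "hello"}]})
print(a == b)  # both providers normalize to the same representation
```

Downstream consumers (summarization, entity indexing, export) then depend only on `Segment`, which is what makes provider swapping and dynamic routing practical.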

Model Context Protocol (MCP) adoption / composability: Granola exposes structured meeting context to third-party LLMs via MCP. This is a product-level protocol to make meeting context portable to other models/tools — a convergence on making context-first primitives available rather than embedding monolithic LLM logic inside the app.

Fine-grained transcript editing and cryptographic/operational deletion: 'delete parts of a transcript' implies segment-indexed transcripts, immutable change logs for auditability, and secure erasure semantics across backups and downstream vector indices (necessitating index re-ingestion and differential deletion propagation).
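The deletion-propagation requirement can be sketched as: removing a segment must remove it from every derived store (here, a vector index) and leave an audit record. All structures are in-memory stand-ins, and the schema is assumed for illustration.

```python
# Segment-indexed transcript with deletion propagated to a derived vector
# index and recorded in an append-only audit log.
segments = {"m1:0": "intro chit-chat", "m1:1": "confidential salary discussion"}
vector_index = {"m1:0": [0.1, 0.2], "m1:1": [0.9, 0.4]}  # embeddings per segment
audit_log = []

def delete_segment(seg_id: str, actor: str):
    segments.pop(seg_id, None)
    vector_index.pop(seg_id, None)  # differential deletion in the derived index
    audit_log.append({"op": "delete", "segment": seg_id, "by": actor})

delete_segment("m1:1", actor="admin@example.com")
print("m1:1" in segments, "m1:1" in vector_index, len(audit_log))
```

In a real system the hard part is everything this sketch elides: propagating the same deletion through backups and any re-ingested indices.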

Risk Factors
Wrapper Risk: medium severity
Feature, Not Product: high severity
No Clear Moat: medium severity
Overclaiming: medium severity
What This Changes

If Granola achieves its technical roadmap, it could become foundational infrastructure for the next generation of AI applications. Success here would accelerate the timeline for downstream companies to build reliable, production-grade AI products. Failure or pivot would signal continued fragmentation in the AI tooling landscape.

Source Evidence (10 quotes)
“Granola MCP connects your meeting notes to Claude, ChatGPT, Cursor, and other AI tools via Model Context Protocol.”
“Granola uses best-in-class transcription providers (like Deepgram and Assembly) and AI providers (like OpenAI and Anthropic) to summarize your meeting.”
“Introducing Granola MCP”
“Ask Granola during a meeting”
“The AI Notepad for back-to-back meetings”
“Model Context Protocol (MCP) as a first-class feature: shipping a context-sharing protocol that pushes meeting context to arbitrary external LLMs/providers — enables model-agnostic orchestration and context portability.”