com.exoquery/mcp-server

Kotlin compile-time SQL library. Docs, code validation, and SQLite execution tools.

Status
Healthy
Score
74.8
Transport
streamable-http
Tools
6

Production readiness

Verdict
Needs remediation
Current validation evidence shows operational or discovery gaps that should be fixed first.
Critical alerts
0
Production verdicts degrade quickly when critical alerts are active.

Evidence confidence

Confidence score
65.0
Based on 20 recent validations, 26 captured checks, and validation age of 605.0 hours.
Live checks captured
26
More direct checks increase trust in the current verdict.
Validation age
605.0h
Lower age means fresher evidence.

Recommended for

OpenAI connectors
OpenAI connectors is marked compatible with score 89.
Claude Desktop
Claude Desktop is marked compatible with score 83.
Smithery
Smithery is marked compatible with score 80.
Generic Streamable HTTP
Generic Streamable HTTP is marked compatible with score 100.

Client readiness verdicts

Ready for ChatGPT custom connector
Ready
Transport compliance should be in good shape.
Confidence: medium (65.0)
Evidence provenance
Winner: live_validation
Supporting sources: live_validation, history, server_card
Disagreements: none
  • initialize: OK
  • tools_list: OK
  • transport_compliance_probe: Error
  • step_up_auth_probe: OK
  • connector_replay_probe: OK — Frozen tool snapshots must survive refresh.
  • request_association_probe: Missing — Roots, sampling, and elicitation should stay request-scoped.
Ready for Claude remote MCP
Ready
Transport behavior should match Claude-compatible HTTP expectations.
Confidence: medium (65.0)
Evidence provenance
Winner: live_validation
Supporting sources: live_validation, history, server_card
Disagreements: none
  • initialize: OK
  • tools_list: OK
  • transport_compliance_probe: Error
Unsafe for write actions
No
Current write surface is bounded enough for cautious review.
Confidence: medium (65.0)
Evidence provenance
Winner: live_validation
Supporting sources: live_validation, history
Disagreements: none
  • action_safety_probe: Warning
Snapshot churn risk
Low
No material tool-surface churn detected in the latest comparison.
Confidence: medium (65.0)
Evidence provenance
Winner: history
Supporting sources: history, live_validation
Disagreements: none
  • tool_snapshot_probe: OK
  • connector_replay_probe: OK

Why not ready by client

ChatGPT custom connector
Ready
Remediation checklist
  • No explicit blockers recorded.
Claude remote MCP
Ready
Remediation checklist
  • No explicit blockers recorded.
Write-safe publishing
Ready
Remediation checklist
  • No explicit blockers recorded.

Verdict traces

Production verdict
Needs remediation
Current validation evidence shows operational or discovery gaps that should be fixed first.
Confidence: medium (65.0)
Winning source: live_validation
Triggering alerts
  • validation_stale • medium • Validation evidence is stale
Client verdict trace table
Verdict | Status | Checks | Winning source | Conflicts
openai_connectors | Ready | initialize, tools_list, transport_compliance_probe, step_up_auth_probe, connector_replay_probe, request_association_probe | live_validation | none
claude_desktop | Ready | initialize, tools_list, transport_compliance_probe | live_validation | none
unsafe_for_write_actions | No | action_safety_probe | live_validation | none
snapshot_churn_risk | Low | tool_snapshot_probe, connector_replay_probe | history | none

Publishability policy profiles

ChatGPT custom connector publishability
Ready
Transport compliance should be in good shape.
  • Search Fetch Only: No
  • Write Actions Present: Yes
  • Oauth Configured: Yes
  • Admin Refresh Required: No
  • Safe For Company Knowledge: No
  • Safe For Messages Api Remote Mcp: No
Claude remote MCP publishability
Ready
Transport behavior should match Claude-compatible HTTP expectations.
  • Search Fetch Only: No
  • Write Actions Present: Yes
  • Oauth Configured: Yes
  • Admin Refresh Required: No
  • Safe For Company Knowledge: No
  • Safe For Messages Api Remote Mcp: No

Compatibility fixtures

ChatGPT custom connector fixture
Passes
Transport compliance should be in good shape.
  • remote_http_endpoint: Passes
  • oauth_discovery: Passes
  • frozen_tool_snapshot_refresh: Passes
  • request_association: Passes
Anthropic remote MCP fixture
Degraded
Transport behavior should match Claude-compatible HTTP expectations.
  • remote_transport: Passes
  • tool_discovery: Passes
  • auth_connect: Passes
  • safe_write_review: Passes

Authenticated validation sessions

Latest profile
remote_mcp
Authenticated session used
Public score isolation
Preview endpoint
/v1/verify
CI preview endpoint
/v1/ci/preview

Public server reputation

Validation success 7d
n/a
Validation success 30d
1.0
Mean time to recover
n/a
Breaking diffs 30d
0
Registry drift frequency 30d
0
Snapshot changes 30d
0

Incident & change feed

Timestamp | Event | Details
Apr 09, 2026 12:56:41 AM UTC | Latest validation: healthy | Score 74.8 with status healthy.
Apr 07, 2026 12:48:39 AM UTC | Score changed | Score delta +0.5 versus the previous run.

Capabilities

Use-case taxonomy
development database search communication

Security posture

Tools analyzed
6
High-risk tools
5
Destructive tools
5
Exec tools
5
Egress tools
0
Secret tools
0
Bulk-access tools
0
Risk distribution
low:1, critical:5

Tool capability & risk inventory

Tool | Capabilities | Risk | Findings | Notes
getExoQueryDocs | read, write, delete, exec, network, filesystem, admin | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | No explicit safeguard hints detected.
getExoQueryDocsMulti | read, write, delete, exec, network, filesystem, admin | Critical | destructive operation; command execution; filesystem mutation; admin mutation | No explicit safeguard hints detected.
listExoQueryDocs | read | Low | none | No explicit safeguard hints detected.
runRawSql | read, write, delete, exec, admin | Critical | destructive operation; command execution; freeform input surface; admin mutation | Safeguards hinted in metadata.
validateAndRunExoquery | read, write, delete, exec, filesystem, admin | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | No explicit safeguard hints detected.
validateExoquery | read, write, delete, exec, filesystem, admin | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | No explicit safeguard hints detected.

Write-action governance

Governance status
Warning
Safe to publish
Auth boundary
oauth_or_auth_required
Blast radius
High
High-risk tools
5
Confirmation signals
none
Safeguard count
1

Status detail: 5 high-risk tool(s), 5 destructive tool(s), 5 exec-capable tool(s); auth boundary is oauth or auth required with 1 safeguard(s) and 0 confirmation signal(s).

Tool | Risk | Flags | Safeguards
getExoQueryDocs | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | no
getExoQueryDocsMulti | Critical | destructive operation; command execution; filesystem mutation; admin mutation | no
runRawSql | Critical | destructive operation; command execution; freeform input surface; admin mutation | yes
validateAndRunExoquery | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | no
validateExoquery | Critical | destructive operation; command execution; freeform input surface; filesystem mutation; admin mutation | no

Action-controls diff

Snapshot changed
no
Disabled-by-default candidates
none
Manual review candidates
none
New actions
Action | Risk | Flags
No newly added actions.
Changed actions
Action | Change types | Risk
No materially changed actions.

Why this score?

Access & Protocol
35/44
Connectivity, auth, and transport expectations for common clients.
Interface Quality
35.88/56
How well the tool/resource interface communicates and behaves under automation.
Security Posture
24.25/36
How safely the exposed tool surface handles destructive actions, egress, execution, secrets, and risky inputs.
Reliability & Trust
23/24
Operational stability, consistency, and trustworthiness over time.
Discovery & Governance
23.5/28
How well the server is documented, listed, and governed in public registries.
Adoption & Market
5/8
Adoption clues and public evidence that the server is intended for external use.

Algorithmic score breakdown

Auth Operability
4/4
Measures whether auth discovery and protected access behave predictably for clients.
Error Contract Quality
0.5/4
Grades machine-readable error structure, status alignment, and remediation hints.
Rate-Limit Semantics
2/4
Checks whether quota/throttle responses are deterministic and automation-friendly.
Schema Completeness
3/4
Completeness of tool descriptions, parameter docs, examples, and schema shape.
Backward Compatibility
4/4
Stability score across tool schema/name drift relative to prior validations.
SLO Health
4/4
Availability, latency, and burst-failure profile across recent validation history.
Security Hygiene
2/4
HTTPS posture, endpoint hygiene, and response-surface hardening checks.
Task Success
4/4
Can an agent reliably initialize, enumerate tools, and execute core MCP flows?
Trust Confidence
4/4
Confidence-adjusted reliability score that penalizes low evidence volume.
Abuse/Noise Resilience
3/4
How well the server preserves core behavior in the presence of noisy traffic patterns.
Prompt Contract
2/4
Quality of prompt metadata, argument shape, and prompt discoverability for clients.
Resource Contract
2/4
How completely resources and resource templates describe URIs, types, and usage shape.
Discovery Metadata
3/4
Homepage, docs, icon, repository, support, and license coverage for directory consumers.
Registry Consistency
2/4
Agreement between stored registry metadata, live server-card data, and current validation output.
Installability
4/4
How cleanly a real client can connect, initialize, enumerate tools, and proceed through auth.
Session Semantics
4/4
Determinism and state behavior across repeated MCP calls, including sticky-session surprises.
Tool Surface Design
3/4
Naming clarity, schema ergonomics, and parameter complexity across the tool surface.
Result Shape Stability
3/4
Stability of declared output schemas across validations, with penalties for drift or missing shapes.
OAuth Interop
4/4
Depth and client compatibility of OAuth/OIDC metadata beyond the minimal protected-resource check.
Recovery Semantics
0.4/4
Whether failures include actionable machine-readable next steps such as retry or upgrade guidance.
Maintenance Signal
4/4
Versioning, update recency, and historical validation cadence that indicate active stewardship.
Adoption Signal
2/4
Directory presence and distribution clues that suggest the server is intended for external use.
Freshness Confidence
4/4
Confidence that recent validations are current enough and dense enough to trust operationally.
Transport Fidelity
4/4
Whether declared transport metadata matches the observed endpoint behavior and response formats.
Spec Recency
2/4
How close the server’s claimed MCP protocol version is to the latest known public revision.
Session Resume
3/4
Whether Streamable HTTP session identifiers and resumed requests behave cleanly for real clients.
Step-Up Auth
4/4
Whether OAuth metadata and WWW-Authenticate challenges support granular, incremental consent instead of broad upfront scopes.
Transport Compliance
0/4
Checks session headers, protocol-version enforcement, session teardown, and expired-session behavior.
Utility Coverage
3/4
Signals support for completions, pagination, and task-oriented utility surfaces that larger clients increasingly expect.
Advanced Capability Coverage
3/4
Coverage of newer MCP surfaces like roots, sampling, elicitation, structured output, and related metadata.
Connector Publishability
3/4
How ready the server looks for client catalogs and managed connector programs.
Tool Snapshot Churn
4/4
Stability of the tool surface across recent validations, including add/remove and output-shape drift.
Connector Replay
4/4
Whether a previously published frozen connector snapshot would remain backward compatible after the latest tool refresh.
Request Association
3/4
Whether roots, sampling, and elicitation appear tied to active client requests instead of arriving unsolicited on idle sessions.
Interactive Flow Safety
3/4
Whether prompts and docs steer users toward safe auth flows instead of pasting secrets directly.
Action Safety
2/4
Risk-weighted view of destructive, exec, egress, and confirmation semantics across the tool surface.
Official Registry Presence
4/4
Whether the server appears directly or indirectly in the official MCP registry.
Provenance Divergence
4/4
How closely official registry metadata, the live server card, and public repo/package signals agree with each other.
Safety Transparency
4/4
Clarity of docs, auth disclosure, support links, and other trust signals visible to integrators.
Tool Capability Clarity
3/4
How clearly the tool surface communicates whether each action reads, writes, deletes, executes, or exports data.
Destructive Operation Safety
2/4
Penalizes delete/revoke/destroy style tools unless auth and safeguards reduce blast radius.
Egress / SSRF Resilience
3/4
Assesses arbitrary URL fetch, crawl, webhook, and remote-request exposure on the tool surface.
Execution / Sandbox Safety
3.2/4
Evaluates shell, code, script, and command-execution exposure and whether that surface appears contained.
Data Exfiltration Resilience
3/4
Assesses export, dump, backup, and bulk-read behavior against the surrounding auth and safeguard signals.
Least Privilege Scope
3/4
Rewards scoped auth metadata and penalizes broad or missing scopes around privileged tools.
Secret Handling Hygiene
3/4
Assesses secret-bearing tools, token leakage risk, and whether the public surface avoids obvious secret exposure.
Supply Chain Signal
2.5/4
Public metadata signal for repository, changelog, license, versioning, and recency that supports supply-chain trust.
Input Sanitization Safety
3/4
Penalizes risky freeform string inputs when schemas do not constrain URLs, code, paths, queries, or templates.
Tool Namespace Clarity
3/4
Measures naming uniqueness and ambiguity across the tool namespace to reduce collision and confusion risk.

Compatibility profiles

OpenAI Connectors
88.9
compatible
Transport compliance should be in good shape.
Connector URL: https://backend.exoquery.com/mcp
# Complete OAuth in the client when prompted.
# Server: com.exoquery/mcp-server
Claude Desktop
83.3
compatible
Transport behavior should match Claude-compatible HTTP expectations.
{
  "mcpServers": {
    "mcp-server": {
      "command": "npx",
      "args": ["mcp-remote", "https://backend.exoquery.com/mcp"]
    }
  }
}
Smithery
80.0
compatible
Machine-readable failure semantics should be present.
smithery mcp add "https://backend.exoquery.com/mcp"
Generic Streamable HTTP
100.0
compatible
No major blockers detected.
curl -sS https://backend.exoquery.com/mcp -H 'content-type: application/json' -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"mcp-verify","version":"0.1.0"}}}'

Actionable remediation

Severity | Remediation | Why it matters | Recommended action
High | Add confirmation and dry-run semantics for risky actions | High-risk write, delete, exec, or egress tools should communicate safeguards clearly. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
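The confirmation/dry-run remediation above can be sketched as a guard around a destructive tool such as runRawSql. This is a minimal illustration only; the `guard_destructive` wrapper and its `dry_run`/`confirm` parameters are hypothetical names, not part of this server's API:

```python
# Hypothetical confirmation/dry-run guard for a destructive MCP tool.
# All names here (guard_destructive, dry_run, confirm) are illustrative.

def run_raw_sql(sql: str) -> str:
    # Stand-in for the real tool body.
    return f"executed: {sql}"

def guard_destructive(tool, *, dry_run: bool = True, confirm: bool = False):
    """Require explicit confirmation before running a destructive tool."""
    def wrapped(*args, **kwargs):
        if dry_run:
            # Report what would happen without executing anything.
            return {"status": "dry_run", "would_run": tool.__name__}
        if not confirm:
            # Surface a machine-readable hint instead of silently executing.
            return {"status": "blocked",
                    "hint": "re-call with confirm=true to execute"}
        return {"status": "ok", "result": tool(*args, **kwargs)}
    return wrapped

safe_run = guard_destructive(run_raw_sql)  # dry-run by default
print(safe_run("DELETE FROM users"))       # nothing is executed
live_run = guard_destructive(run_raw_sql, dry_run=False, confirm=True)
print(live_run("SELECT 1"))
```

Signals like these are exactly what the action_safety_probe looks for when it reports "confirmation=none".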
High | Align session and protocol behavior with Streamable HTTP expectations | Clients increasingly rely on MCP-Protocol-Version, session teardown, and expired-session semantics. | Align MCP-Protocol-Version, MCP-Session-Id, DELETE teardown, and expired-session handling with the transport spec.
Playbook
  • Return `Mcp-Session-Id` and `Mcp-Protocol-Version` headers consistently on streamable HTTP responses.
  • Honor `DELETE` session teardown and return `404` when a deleted session is reused.
  • Reject invalid protocol-version headers with `400 Bad Request`.
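The playbook rules above can be sketched as a transport-agnostic request handler. The session store and `handle` function are hypothetical; the header names and status codes follow the Streamable HTTP expectations this report checks:

```python
# Sketch: Streamable HTTP session and protocol-version handling.
# The session store and handler shape are assumptions for illustration.
SUPPORTED_PROTOCOLS = {"2025-03-26", "2025-11-25"}  # illustrative set
sessions = {"abc123"}  # live session ids (hypothetical store)

def handle(method: str, headers: dict) -> int:
    proto = headers.get("Mcp-Protocol-Version")
    if proto is not None and proto not in SUPPORTED_PROTOCOLS:
        return 400  # reject invalid protocol-version headers
    sid = headers.get("Mcp-Session-Id")
    if sid is not None and sid not in sessions:
        return 404  # reusing a deleted/expired session must fail
    if method == "DELETE" and sid in sessions:
        sessions.discard(sid)  # honor explicit session teardown
        return 200
    return 200
```

This mirrors the three failures flagged in the transport-compliance drilldown: missing session id, missing protocol header, and a bad protocol version that was not rejected.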
High | Associate roots, sampling, and elicitation with active client requests | Modern MCP guidance expects roots, sampling, and elicitation traffic to be tied to an active client request instead of arriving unsolicited on idle sessions. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
High | Publish a complete server card | Missing or incomplete server-card metadata weakens discovery, documentation, and trust signals. | Serve /.well-known/mcp/server-card.json and include tools, prompts/resources, homepage, and support links.
Playbook
  • Publish `/.well-known/mcp/server-card.json`.
  • Include homepage, repository, support, tools, prompts/resources, and auth metadata.
  • Revalidate the server after publishing the card.
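A minimal sketch of the server-card payload the playbook describes. The field names are assumptions modeled on the checklist above (tools, prompts/resources, homepage, support), not a normative schema, and the URLs are placeholders rather than this server's real links:

```python
import json

# Hypothetical minimal server card; field names mirror the checklist above.
# URLs marked "placeholder" are NOT this server's real links.
server_card = {
    "name": "com.exoquery/mcp-server",
    "description": ("Kotlin compile-time SQL library. Docs, code validation, "
                    "and SQLite execution tools."),
    "homepage": "https://example.com",           # placeholder
    "repository": "https://example.com/repo",    # placeholder
    "support": "https://example.com/support",    # placeholder
    "tools": ["getExoQueryDocs", "getExoQueryDocsMulti", "listExoQueryDocs",
              "runRawSql", "validateAndRunExoquery", "validateExoquery"],
    "prompts": [],
    "resources": [],
}
# Serve this document at /.well-known/mcp/server-card.json.
print(json.dumps(server_card, indent=2))
```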
Medium | Adopt a current MCP protocol revision | Older protocol revisions reduce compatibility with newer clients and registry programs. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
Medium | Close connector-publishing gaps | Connector catalogs care about protocol recency, session behavior, auth clarity, and tool-surface stability. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
Medium | Publish OpenID configuration | OIDC metadata improves token validation and client compatibility. | Expose /.well-known/openid-configuration with issuer, jwks_uri, and supported grants.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
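A sketch of the missing /.well-known/openid-configuration document, reusing only endpoints already observed in this report's raw OAuth metadata; the exact field set a given client requires may differ:

```python
import json

# Sketch of /.well-known/openid-configuration. The endpoint values below
# are taken from the oauth_authorization_server evidence in this report.
openid_config = {
    "issuer": "https://backend.exoquery.com",
    "jwks_uri": "https://backend.exoquery.com/oauth/jwks",
    "authorization_endpoint": "https://backend.exoquery.com/oauth/authorize",
    "token_endpoint": "https://backend.exoquery.com/oauth/token",
    "grant_types_supported": ["authorization_code", "refresh_token"],
    "response_types_supported": ["code"],
    "scopes_supported": ["mcp"],
}
print(json.dumps(openid_config, indent=2))
```

Serving this would also clear the openid_configuration check, which currently fails with 403 Forbidden.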
Medium | Respond to stale validation evidence | Latest validation is 605.0 hours old. | Trigger a fresh validation run or increase scheduler priority for this server.
Playbook
  • Queue a new validation run now.
  • Inspect whether the scheduler priority should be raised for this server.
  • Do not rely on stale evidence for production decisions.
Medium | Support resumable HTTP sessions cleanly | Modern MCP clients increasingly expect resumable session behavior on streamable HTTP transports. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
Low | Expose modern utility surfaces like completions, pagination, or tasks | Utility coverage improves interoperability with larger clients and long-lived agent workflows. | Expose completions, pagination, and task metadata where supported so larger clients can plan and resume work safely.
Playbook
  • Advertise `completions`, pagination cursors, and `tasks` only when they are actually supported.
  • Return `nextCursor` on large list operations when pagination is available.
  • Document task support and whether it requires step-up auth.
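One way to return nextCursor on large list operations, as the playbook suggests. This is a sketch: the page size and the integer-offset cursor encoding are arbitrary choices for illustration, not part of this server:

```python
# Sketch: cursor pagination for a tools/list-style response.
# The cursor encoding (stringified offset) is an arbitrary illustration;
# real cursors should be opaque to clients.
TOOLS = ["getExoQueryDocs", "getExoQueryDocsMulti", "listExoQueryDocs",
         "runRawSql", "validateAndRunExoquery", "validateExoquery"]

def list_tools(cursor=None, page_size=4):
    start = int(cursor) if cursor else 0
    page = TOOLS[start:start + page_size]
    result = {"tools": page}
    if start + page_size < len(TOOLS):
        # Only emit nextCursor when more results remain.
        result["nextCursor"] = str(start + page_size)
    return result

first = list_tools()
second = list_tools(first["nextCursor"])  # client passes cursor back verbatim
```

Returning `nextCursor` like this is the evidence the utility_coverage_probe reports as missing ("No nextCursor evidence").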
Low | Harden generic GET handling | Simple probe requests should not surface server instability or noisy failures. | Harden generic GET handlers around the origin of https://backend.exoquery.com/mcp so incidental traffic does not produce noisy failures.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.
Low | Publish newer MCP capability signals | Roots, sampling, elicitation, structured outputs, and related metadata improve client understanding and ranking. | Inspect the latest validation evidence and resolve the client-visible regression.
Playbook
  • Inspect the latest validation evidence.
  • Resolve the highest-severity client-facing gap first.
  • Revalidate and confirm the score and verdict improve.

Point loss breakdown

Component | Current | Points missing
Transport Compliance 0/4 -4.0
Recovery Semantics 0.4/4 -3.6
Error Contract 0.5/4 -3.5
Spec Recency 2/4 -2.0
Security Hygiene 2/4 -2.0
Resource Contract 2/4 -2.0
Registry Consistency 2/4 -2.0
Rate Limit Semantics 2/4 -2.0
Prompt Contract 2/4 -2.0
Destructive Operation Safety 2/4 -2.0
Adoption Signal 2/4 -2.0
Action Safety 2/4 -2.0

Validation diff

Score delta
0
Summary changed
no
Tool delta
0
Prompt delta
0
Auth mode changed
no
Write surface expanded
no
Protocol regressed
no
Registry drift changed
no

Regressed checks: none

Improved checks: none

Component | Previous | Latest | Delta
No component deltas between the latest two runs.

Tool snapshot diff & changelog

Snapshot changed
no
Added tools
none
Removed tools
none
Required-argument changes
Tool | Added required args | Removed required args
No required-argument changes detected.
Output-schema drift
Tool | Previous properties | Latest properties
No output-schema drift detected.

Connector replay

Status
OK
Backward compatible
yes
Would break after refresh
no
Added tools
none
Removed tools
none
Additive output changes
none
Required-argument replay breaks
Tool | Added required args | Removed required args
No required-argument replay breaks detected.
Output-schema replay breaks
Tool | Removed properties | Added properties
No output-schema replay breaks detected.
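The replay checks above can be sketched as a diff between a frozen connector snapshot and the latest tool surface. The snapshot shape used here (tool name mapped to its required arguments) is an assumption; the probe's real internal format is not published:

```python
# Sketch: replay-compatibility diff between a frozen connector snapshot
# and the latest tool surface. Snapshot shape: {tool_name: [required_args]}.
def replay_diff(frozen: dict, latest: dict) -> dict:
    removed = sorted(set(frozen) - set(latest))
    added = sorted(set(latest) - set(frozen))
    arg_breaks = [
        name for name in frozen.keys() & latest.keys()
        # Newly required arguments break clients replaying frozen calls.
        if set(latest[name]) - set(frozen[name])
    ]
    return {
        "removed_tools": removed,
        "added_tools": added,
        "required_arg_breaks": arg_breaks,
        "backward_compatible": not removed and not arg_breaks,
    }

frozen = {"runRawSql": ["sql"], "listExoQueryDocs": []}
latest = {"runRawSql": ["sql"], "listExoQueryDocs": []}
```

With identical snapshots, as in this server's latest run, the diff reports backward compatibility with no breaks.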

Transport compliance drilldown

Probe status
Error
Transport
streamable-http
Session header
no
Protocol header
no
Bad protocol response
200
DELETE teardown
n/a
Expired session retry
n/a
Last-Event-ID visible
no

Issues: missing_session_id, missing_protocol_header, bad_protocol_not_rejected
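The three issues above can be reproduced by inspecting the initialize response directly. A sketch of the checks the probe appears to run; the `compliance_issues` function is hypothetical, but the headers passed in are the ones actually observed in this report's raw evidence:

```python
# Sketch: transport-compliance checks over an observed initialize response.
# The function name and shape are illustrative, not the probe's real API.
def compliance_issues(headers: dict, bad_protocol_status: int) -> list:
    issues = []
    names = {k.lower() for k in headers}
    if "mcp-session-id" not in names:
        issues.append("missing_session_id")
    if "mcp-protocol-version" not in names:
        issues.append("missing_protocol_header")
    if bad_protocol_status < 400:
        # A bogus protocol-version header should have been rejected.
        issues.append("bad_protocol_not_rejected")
    return issues

# Headers observed in this report's raw initialize evidence:
observed = {"content-type": "application/json",
            "strict-transport-security": "max-age=31536000 ; includeSubDomains"}
print(compliance_issues(observed, bad_protocol_status=200))
```

Run against the observed response, this reproduces exactly the issue list above.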

Request association

Status
Missing
Advertised capabilities
none
Observed idle methods
none
Violating methods
none
Probe HTTP status
n/a
Issues
none
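A sketch of the request-scoping rule this probe enforces: server-initiated roots, sampling, and elicitation traffic should reference an active client request rather than arrive unsolicited on an idle session. The message shape and the `relatedRequestId` field here are assumptions for illustration:

```python
# Sketch: flag server-initiated messages not tied to an active request.
# Message shape and the relatedRequestId field are assumptions.
REQUEST_SCOPED = {"roots/list", "sampling/createMessage", "elicitation/create"}

def violations(messages: list, active_request_ids: set) -> list:
    return [
        m["method"] for m in messages
        if m["method"] in REQUEST_SCOPED
        and m.get("relatedRequestId") not in active_request_ids
    ]

# Unsolicited sampling on an idle session would be flagged:
idle = violations([{"method": "sampling/createMessage"}], set())
# The same message tied to an in-flight request would pass:
scoped = violations([{"method": "sampling/createMessage",
                      "relatedRequestId": 1}], {1})
```

For this server the probe reports Missing rather than a violation, because no request-association capabilities are advertised at all.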

Utility coverage

Probe status
Warning
Completions
advertised
Completion probe target: none
Pagination
not detected
No nextCursor evidence.
Tasks
Missing
Advertised: no

Benchmark tasks

Benchmark task | Status | Evidence
Discover tools | Passes
  • initialize: OK
  • tools_list: OK
Read-only fetch flow | Degraded
  • resource_read: Missing
  • read_only_tool_surface: OK
OAuth-required connect | Passes
  • oauth_protected_resource: OK
  • step_up_auth_probe: OK
Safe write flow with confirmation | Degraded
  • action_safety_probe: Warning

Registry & provenance divergence

Probe status
OK
Direct official match
yes
Drift fields
none
Field | Registry | Live server card
Title | n/a | n/a
Version | n/a | n/a
Homepage | n/a | n/a

Active alerts

Aliases & registry graph

Identifier | Source | Canonical | Score
com.exoquery/mcp-server | official_registry | yes | 74.81

Alias consolidation

Canonical identifier
com.exoquery/mcp-server
Duplicate aliases
0
Registry sources
official_registry
Homepages
none
Source disagreements
Field | What differs | Observed values
No source disagreements detected.

Install snippets

OpenAI Connectors
Connector URL: https://backend.exoquery.com/mcp
# Complete OAuth in the client when prompted.
# Server: com.exoquery/mcp-server
Claude Desktop
{
  "mcpServers": {
    "mcp-server": {
      "command": "npx",
      "args": ["mcp-remote", "https://backend.exoquery.com/mcp"]
    }
  }
}
Smithery
smithery mcp add "https://backend.exoquery.com/mcp"
Generic HTTP
curl -sS https://backend.exoquery.com/mcp -H 'content-type: application/json' -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"mcp-verify","version":"0.1.0"}}}'

Agent access & tool surface

Live server tools
getExoQueryDocs getExoQueryDocsMulti listExoQueryDocs runRawSql validateAndRunExoquery validateExoquery
Observed from the latest live validation against https://backend.exoquery.com/mcp. This is the target server surface, not Verify's own inspection tools.
Live capability counts
6 tools • 0 prompts • 0 resources
Counts come from the latest tools/list, prompts/list, and resources/list checks.
Inspect with Verify
search_servers recommend_servers get_server_report compare_servers
Use Verify itself to search, recommend, compare, and fetch the full report for com.exoquery/mcp-server.
Direct machine links

Claims & monitoring

Server ownership

No verified maintainer claim recorded.

Watch subscriptions
0
Teams: none

Alert routing

Active watches
0
Generic webhooks
0
Slack routes
0
Teams routes
0
Email routes
0
Watch | Team | Channels | Minimum severity
No active watch destinations.

Maintainer analytics

Validation Run Count
20
Average Latency Ms
1064.42
Healthy Run Ratio Recent
1.0
Registry Presence Count
1
Active Alert Count
1
Watcher Count
0
Verified Claim
False
Taxonomy Tags
development, database, search, communication
Score Trend
74.81, 74.81, 74.81, 74.3, 74.3, 74.3, 74.3, 74.3, 74.3, 74.3
Remediation Count
12
High Risk Tool Count
5
Destructive Tool Count
5
Exec Tool Count
5

Maintainer response quality

Score
16.67
Verified claim
Support contact
Changelog present
Incident notes present
Tool changes documented
Annotation history
Annotation count
0

Maintainer annotations

No maintainer annotations have been recorded yet.

Maintainer rebuttals & expected behavior

No maintainer rebuttals or expected-behavior overrides are recorded yet.

Latest validation evidence

Latest summary
Healthy
Validation profile
remote_mcp
Started
Apr 09, 2026 12:56:40 AM UTC
Latency
782.3 ms

Failures

Checks

Check | Status | Latency | Evidence
action_safety_probe Warning n/a 5 high-risk, 5 destructive, 5 exec-capable tool(s); auth present; safeguards=1; confirmation=none.
advanced_capabilities_probe Warning n/a Only 3 capability signal(s): completions, prompts, resources.
connector_publishability_probe Warning n/a Publishability blockers: transport compliance, server card.
connector_replay_probe OK n/a Backward compatible with no breaking tool-surface changes.
determinism_probe OK 45.2 ms Check completed
initialize OK 44.3 ms Protocol 2025-03-26
interactive_flow_probe OK n/a Check completed
oauth_authorization_server OK 51.6 ms authorization_endpoint, code_challenge_methods_supported, grant_types_supported, issuer
oauth_protected_resource OK 51.5 ms 1 authorization server(s)
official_registry_probe OK n/a Check completed
openid_configuration Error 17.1 ms Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/openid-configuration' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403
probe_noise_resilience Error 19.2 ms Fetched https://backend.exoquery.com/robots.txt
prompt_get Missing n/a not advertised
prompts_list OK 37.2 ms 0 prompt(s) exposed
protocol_version_probe Warning n/a Claims 2025-03-26; 2 release(s) behind 2025-11-25.
provenance_divergence_probe OK n/a Check completed
request_association_probe Missing n/a No request-association capabilities were advertised.
resource_read Missing n/a not advertised
resources_list OK 39.4 ms 0 resource item(s) exposed
server_card Error 111.1 ms Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/mcp/server-card.json' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403
session_resume_probe Warning n/a no session id
step_up_auth_probe OK n/a Scopes=mcp.
tool_snapshot_probe OK n/a Check completed
tools_list OK 40.5 ms 6 tool(s) exposed
transport_compliance_probe Error 43.4 ms Issues: missing session id, missing protocol header, bad protocol not rejected (bad protocol=200).
utility_coverage_probe Warning 162.5 ms Completions advertised; no pagination evidence; tasks missing.

Raw evidence view

{
  "checks": {
    "action_safety_probe": {
      "details": {
        "auth_present": true,
        "confirmation_signals": [],
        "safeguard_count": 1,
        "summary": {
          "bulk_access_tools": 0,
          "capability_distribution": {
            "admin": 5,
            "delete": 5,
            "exec": 5,
            "filesystem": 4,
            "network": 2,
            "read": 6,
            "write": 5
          },
          "destructive_tools": 5,
          "egress_tools": 0,
          "exec_tools": 5,
          "high_risk_tools": 5,
          "risk_distribution": {
            "critical": 5,
            "high": 0,
            "low": 1,
            "medium": 0
          },
          "secret_tools": 0,
          "tool_count": 6
        }
      },
      "latency_ms": null,
      "status": "warning"
    },
    "advanced_capabilities_probe": {
      "details": {
        "capabilities": {
          "completions": true,
          "elicitation": false,
          "prompts": true,
          "resource_links": false,
          "resources": true,
          "roots": false,
          "sampling": false,
          "structured_outputs": false
        },
        "enabled": [
          "completions",
          "prompts",
          "resources"
        ],
        "enabled_count": 3,
        "initialize_capability_keys": [
          "completions",
          "prompts",
          "resources",
          "tools"
        ]
      },
      "latency_ms": null,
      "status": "warning"
    },
    "connector_publishability_probe": {
      "details": {
        "blockers": [
          "transport_compliance",
          "server_card"
        ],
        "criteria": {
          "action_safety": true,
          "auth_flow": true,
          "connector_replay": true,
          "initialize": true,
          "protocol_version": true,
          "remote_transport": true,
          "request_association": true,
          "server_card": false,
          "session_resume": true,
          "step_up_auth": true,
          "tool_surface": true,
          "tools_list": true,
          "transport_compliance": false
        },
        "high_risk_tools": 5,
        "tool_count": 6,
        "transport": "streamable-http"
      },
      "latency_ms": null,
      "status": "warning"
    },
    "connector_replay_probe": {
      "details": {
        "added_tools": [],
        "additive_output_changes": [],
        "backward_compatible": true,
        "output_breaks": [],
        "removed_tools": [],
        "required_arg_breaks": [],
        "would_break_after_refresh": false
      },
      "latency_ms": null,
      "status": "ok"
    },
    "determinism_probe": {
      "details": {
        "attempts": 2,
        "baseline_signature": "a605b09b132fce76ad210533bb79e99eb23e85baf0b1b448c4b024ff3cc6f9dd",
        "errors": [],
        "matches": 2,
        "stable_ratio": 1.0,
        "successful": 2
      },
      "latency_ms": 45.2,
      "status": "ok"
    },
    "initialize": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "id": 1,
          "jsonrpc": "2.0",
          "result": {
            "capabilities": {
              "completions": {},
              "prompts": {
                "listChanged": false
              },
              "resources": {
                "listChanged": false,
                "subscribe": false
              },
              "tools": {
                "listChanged": false
              }
            },
            "protocolVersion": "2025-03-26",
            "serverInfo": {
              "name": "exoquery-mcp",
              "version": "1.0.0"
            }
          }
        },
        "url": "https://backend.exoquery.com/mcp"
      },
      "latency_ms": 44.27,
      "status": "ok"
    },
    "interactive_flow_probe": {
      "details": {
        "oauth_supported": true,
        "prompt_available": false,
        "risk_hits": [],
        "safe_hits": []
      },
      "latency_ms": null,
      "status": "ok"
    },
    "oauth_authorization_server": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "authorization_endpoint": "https://backend.exoquery.com/oauth/authorize",
          "code_challenge_methods_supported": [
            "S256"
          ],
          "grant_types_supported": [
            "authorization_code",
            "refresh_token"
          ],
          "issuer": "https://backend.exoquery.com",
          "jwks_uri": "https://backend.exoquery.com/oauth/jwks",
          "registration_endpoint": "https://backend.exoquery.com/oauth/register",
          "response_types_supported": [
            "code"
          ],
          "scopes_supported": [
            "mcp"
          ],
          "token_endpoint": "https://backend.exoquery.com/oauth/token",
          "token_endpoint_auth_methods_supported": [
            "none",
            "client_secret_post"
          ]
        },
        "url": "https://backend.exoquery.com/.well-known/oauth-authorization-server"
      },
      "latency_ms": 51.57,
      "status": "ok"
    },
    "oauth_protected_resource": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "authorization_servers": [
            "https://backend.exoquery.com"
          ],
          "resource": "https://backend.exoquery.com"
        },
        "url": "https://backend.exoquery.com/.well-known/oauth-protected-resource"
      },
      "latency_ms": 51.5,
      "status": "ok"
    },
    "official_registry_probe": {
      "details": {
        "direct_match": true,
        "official_peer_count": 1,
        "registry_identifier": "com.exoquery/mcp-server",
        "registry_source": "official_registry"
      },
      "latency_ms": null,
      "status": "ok"
    },
    "openid_configuration": {
      "details": {
        "error": "Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/openid-configuration'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403",
        "url": "https://backend.exoquery.com/.well-known/openid-configuration"
      },
      "latency_ms": 17.11,
      "status": "error"
    },
    "probe_noise_resilience": {
      "details": {
        "headers": {
          "content-type": "application/json"
        },
        "http_status": 403,
        "url": "https://backend.exoquery.com/robots.txt"
      },
      "latency_ms": 19.19,
      "status": "error"
    },
    "prompt_get": {
      "details": {
        "reason": "not_advertised"
      },
      "latency_ms": null,
      "status": "missing"
    },
    "prompts_list": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "id": 3,
          "jsonrpc": "2.0",
          "result": {
            "prompts": []
          }
        },
        "url": "https://backend.exoquery.com/mcp"
      },
      "latency_ms": 37.22,
      "status": "ok"
    },
    "protocol_version_probe": {
      "details": {
        "claimed_version": "2025-03-26",
        "lag_days": 244,
        "latest_known_version": "2025-11-25",
        "releases_behind": 2,
        "validator_protocol_version": "2025-03-26"
      },
      "latency_ms": null,
      "status": "warning"
    },
    "provenance_divergence_probe": {
      "details": {
        "direct_official_match": true,
        "drift_fields": [],
        "metadata_document_count": 1,
        "registry_homepage": null,
        "registry_repository": null,
        "registry_title": null,
        "registry_version": null,
        "server_card_homepage": null,
        "server_card_repository": null,
        "server_card_title": null,
        "server_card_version": null
      },
      "latency_ms": null,
      "status": "ok"
    },
    "request_association_probe": {
      "details": {
        "reason": "no_request_association_capabilities_advertised"
      },
      "latency_ms": null,
      "status": "missing"
    },
    "resource_read": {
      "details": {
        "reason": "not_advertised"
      },
      "latency_ms": null,
      "status": "missing"
    },
    "resources_list": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "id": 5,
          "jsonrpc": "2.0",
          "result": {
            "resources": []
          }
        },
        "url": "https://backend.exoquery.com/mcp"
      },
      "latency_ms": 39.4,
      "status": "ok"
    },
    "server_card": {
      "details": {
        "error": "Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/mcp/server-card.json'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403",
        "url": "https://backend.exoquery.com/.well-known/mcp/server-card.json"
      },
      "latency_ms": 111.11,
      "status": "error"
    },
    "session_resume_probe": {
      "details": {
        "protocol_version": "2025-03-26",
        "reason": "no_session_id",
        "resume_expected": true,
        "transport": "streamable-http"
      },
      "latency_ms": null,
      "status": "warning"
    },
    "step_up_auth_probe": {
      "details": {
        "auth_required_checks": [],
        "broad_scopes": [],
        "challenge_headers": [],
        "minimal_scope_documented": true,
        "oauth_present": true,
        "scope_specificity_ratio": 0.5,
        "step_up_signals": [],
        "supported_scopes": [
          "mcp"
        ]
      },
      "latency_ms": null,
      "status": "ok"
    },
    "tool_snapshot_probe": {
      "details": {
        "added": [],
        "changed_outputs": [],
        "current_tool_count": 6,
        "previous_tool_count": 6,
        "removed": [],
        "similarity": 1.0
      },
      "latency_ms": null,
      "status": "ok"
    },
    "tools_list": {
      "details": {
        "headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "http_status": 200,
        "payload": {
          "id": 2,
          "jsonrpc": "2.0",
          "result": {
            "tools": [
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nAccess comprehensive ExoQuery documentation organized by topic and category.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nAVAILABLE DOCUMENTATION CATEGORIES:\n\n1. **Getting Started**\n   - Introduction: What ExoQuery is and why it exists\n   - Installation: Project setup and dependencies\n   - Quick Start: First query in minutes\n\n2. **Core Concepts**\n   - SQL Blocks: The sql { } construct and query building\n   - Parameters: Safe runtime data handling\n   - Composing Queries: Functional query composition\n\n3. **Query Operations**\n   - Basic Operations: Map, filter, and transformations\n   - Joins: Inner, left, and implicit joins\n   - Grouping: GROUP BY and HAVING clauses\n   - Sorting: ORDER BY operations\n   - Subqueries: Correlated and nested queries\n   - Window Functions: Advanced analytics\n\n4. **Actions**\n   - Insert: INSERT with returning and conflict handling\n   - Update: UPDATE operations with setParams\n   - Delete: DELETE with returning\n   - Batch Operations: Bulk inserts and updates\n\n5. **Advanced Features**\n   - SQL Fragment Functions: Reusable SQL components with @SqlFragment\n   - Dynamic Queries: Runtime query generation with @SqlDynamic\n   - Free Blocks: Custom SQL and user-defined functions\n   - Transactions: Transaction support patterns\n   - Polymorphic Queries: Interfaces, sealed classes, higher-order functions\n   - Local Variables: Variables within SQL blocks\n\n6. 
**Data Handling**\n   - Serialization: kotlinx.serialization integration\n   - Custom Type Encoding: Custom encoders and decoders\n   - JSON Columns: JSON and JSONB support (PostgreSQL)\n   - Column Naming: @SerialName and @ExoEntity annotations\n   - Nested Datatypes: Complex data structures\n   - Kotlinx Integration: JSON and other serialization formats\n\n7. **Schema-First Development**\n   - Entity Generation: Compile-time code generation from database schema\n   - AI-Enhanced Entities: Using LLMs to generate cleaner entity code\n\n8. **Reference**\n   - SQL Functions: Available string, math, and date functions\n   - API Reference: Core types and function signatures\n\nHOW TO USE THIS RESOURCE:\n\nThe resource URI follows the pattern:\n  exoquery://docs/{file-path}\n\nWhere {file-path} is the relative path from the docs root, e.g.:\n  - exoquery://docs/01-getting-started/01-introduction.md\n  - exoquery://docs/03-query-operations/02-joins.md\n  - exoquery://docs/05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about ExoQuery syntax, features, or capabilities\n- User needs examples of specific query patterns\n- User encounters errors and needs to verify correct usage\n- User wants to understand advanced features or best practices\n",
                "inputSchema": {
                  "properties": {
                    "filePath": {
                      "description": "\nThe documentation file path to retrieve.\n\nFormat: Relative path from docs root (e.g., \"01-getting-started/01-introduction.md\")\n\nThe full URI is: exoquery://docs/{file-path}\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "filePath"
                  ],
                  "type": "object"
                },
                "name": "getExoQueryDocs",
                "title": "getExoQueryDocs"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nAccess multiple ExoQuery documentation sections simultaneously.\n\nThis tool is similar to the single-document retrieval tool but allows fetching multiple documentation files in a single request. This is particularly useful when you need to gather information from several related topics at once.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nHOW TO USE THIS RESOURCE:\n\nProvide a list of file paths, where each path is the relative path from the docs root, e.g.:\n  - 01-getting-started/01-introduction.md\n  - 03-query-operations/02-joins.md\n  - 05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach returned document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about multiple ExoQuery topics that require information from different sections\n- User needs to compare or understand relationships between different features\n- User wants to get comprehensive information across multiple categories\n- More efficient than making multiple single-document requests\n",
                "inputSchema": {
                  "properties": {
                    "filePaths": {
                      "description": "\nA list of documentation file paths to retrieve.\n\nFormat: List of relative paths from docs root (e.g., [\"01-getting-started/01-introduction.md\", \"03-query-operations/02-joins.md\"])\n\nEach path follows the pattern used in single-document retrieval: {category-folder}/{file-name}.md\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                      "items": {
                        "type": "string"
                      },
                      "type": "array"
                    }
                  },
                  "required": [
                    "filePaths"
                  ],
                  "type": "object"
                },
                "name": "getExoQueryDocsMulti",
                "title": "getExoQueryDocsMulti"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "Lists all available ExoQuery documentation resources with their metadata",
                "inputSchema": {
                  "properties": {},
                  "required": [],
                  "type": "object"
                },
                "name": "listExoQueryDocs",
                "title": "listExoQueryDocs"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nExecute raw, client-provided SQL queries against an ephemeral database initialized with the provided schema.\nReturns query results in a simple JSON format with column headers and row data as a 2D array.\n\nThe database type (SQLite or Postgres) is specified via the databaseType parameter:\n- SQLITE: In-memory, lightweight, uses standard SQLite syntax\n- POSTGRES: Temporary isolated schema with dedicated user, uses PostgreSQL syntax and features\n\nWHEN TO USE: When you need to run your own hand-written SQL queries to test database behavior or\ncompare the output with ExoQuery results from validateAndRunExoquery. This lets you verify that\nExoQuery-generated SQL produces the same results as your expected SQL.\n\nINPUT REQUIREMENTS:\n- query: A valid SQL query (SELECT, INSERT, UPDATE, DELETE, etc.)\n- schema: SQL schema with CREATE TABLE and INSERT statements to initialize the test database\n- databaseType: Either \"SQLITE\" or \"POSTGRES\" (defaults to SQLITE if not specified)\n\nOUTPUT FORMAT:\n\nOn success, returns JSON with the SQL query and a 2D array of results:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nOutput format details:\n- First array element contains column headers\n- Subsequent array elements contain row data\n- All values are returned as strings\n\nOn error, returns JSON with error message and the attempted query (if available):\n{\"error\":\"Query execution failed: no such table: USERS\",\"sql\":\"SELECT * FROM USERS\"}\n\nOr if schema initialization fails:\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users ...\\\\n--------\",\"sql\":\"CREAT TABLE users ...\"}\n\nEXAMPLE INPUT:\n\nQuery:\nSELECT * FROM users ORDER BY id\n\nSchema:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  
name TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nEXAMPLE ERROR OUTPUT (bad table name):\n{\"error\":\"Query execution failed: no such table: invalid_table\",\"sql\":\"SELECT * FROM invalid_table\"}\n\nEXAMPLE ERROR OUTPUT (bad schema):\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users (id INTEGER)\\\\n--------\\\\nCheck that the initialization SQL is valid and compatible with SQLite.\",\"sql\":\"CREAT TABLE users (id INTEGER)\"}\n\nCOMMON QUERY EXAMPLES:\n\nSelect all rows:\nSELECT * FROM users\n\nSelect specific columns with filtering:\nSELECT name, age FROM users WHERE age > 25\n\nAggregate functions:\nSELECT COUNT(*) as total FROM users\n\nJoin queries:\nSELECT u.name, o.total FROM users u JOIN orders o ON u.id = o.user_id\n\nInsert data:\nINSERT INTO users (name, age) VALUES ('David', 40)\n\nUpdate data:\nUPDATE users SET age = 31 WHERE name = 'Alice'\n\nDelete data:\nDELETE FROM users WHERE age < 25\n\nCount with grouping:\nSELECT age, COUNT(*) as count FROM users GROUP BY age\n\nSCHEMA RULES:\n- Use standard SQLite syntax\n- Table names are case-sensitive (use lowercase for simplicity or quote names)\n- Include INSERT statements to populate test data for meaningful results\n- Supported data types: INTEGER, TEXT, REAL, BLOB, NULL\n- Use INTEGER PRIMARY KEY for auto-increment columns\n- Schema SQL is split on semicolons (;), so each statement after a ';' is executed separately\n- Avoid semicolons in comments as they will cause statement parsing issues\n\nCOMPARISON WITH EXOQUERY:\nThis tool 
is designed to work alongside validateAndRunExoquery for comparison purposes:\n1. Use validateAndRunExoquery to run ExoQuery Kotlin code and see the generated SQL + results\n2. Use runRawSql with your own hand-written SQL to verify you get the same output\n3. Compare the outputs to ensure ExoQuery generates the SQL you expect\n4. Test edge cases with plain SQL before writing equivalent ExoQuery code\n",
                "inputSchema": {
                  "properties": {
                    "query": {
                      "description": "\nA valid SQL query to execute against the database.\n\nCan be any valid SQL statement (syntax depends on databaseType parameter):\n- SELECT queries (with WHERE, JOIN, GROUP BY, ORDER BY, LIMIT, etc.)\n- INSERT statements\n- UPDATE statements\n- DELETE statements\n- DDL statements like CREATE/ALTER/DROP (applied after schema initialization)\n\nThe query will be executed against a database initialized with the provided schema parameter.\n\nExample:\nSELECT * FROM users WHERE age > 25 ORDER BY name\n",
                      "type": "string"
                    },
                    "schema": {
                      "description": "\nSQL schema to initialize the ephemeral test database.\n\nMust include:\n1. CREATE TABLE statements for all tables used in the query\n2. INSERT statements with test data\n\nUse syntax appropriate for the selected databaseType (SQLite or Postgres).\nTable names are case-sensitive. The schema is split on semicolons, so each statement is executed separately.\n\nExample:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  name TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "query",
                    "schema"
                  ],
                  "type": "object"
                },
                "name": "runRawSql",
                "title": "runRawSql"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if a user divided by zero):\n{\"outputErr\": \"Exception in thread \"main\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
                "inputSchema": {
                  "properties": {
                    "code": {
                      "description": "\nComplete ExoQuery Kotlin code to compile and execute.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes that EXACTLY match your schema tables\n3. The query expression\n4. A main() function ending with .buildFor.<Dialect>().runSample()\n    This function MUST be present to trigger SQL generation and execution.\n\nUse @SerialName(\"column_name\") when Kotlin property names differ from SQL column names.\nUse @Contextual for BigDecimal fields.\nUse @SqlJsonValue on data classes that represent JSON column values.\n\nMultiple queries in main() will produce multiple output JSON objects.\n",
                      "type": "string"
                    },
                    "databaseType": {
                      "description": "Database type: SQLITE or POSTGRES (default: SQLITE)",
                      "type": "string"
                    },
                    "schema": {
                      "description": "\nSQL schema to initialize the Sqlite test database.\n\nMust include:\n1. CREATE TABLE statements for all tables referenced in the query\n2. INSERT statements with test data to verify query behavior\n\nTable and column names must exactly match the data classes in the code.\nUse double quotes around table names to preserve case: CREATE TABLE \"User\" (...)\n\nCommon error: Table \"USER\" not found, means you wrote CREATE TABLE User but queried \"User\".\nAlways quote table names in schema to match ExoQuery's generated SQL.\n\nExample:\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nINSERT INTO \"User\" VALUES (1, 'Alice', 30), (2, 'Bob', 25);\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "code",
                    "schema"
                  ],
                  "type": "object"
                },
                "name": "validateAndRunExoquery",
                "title": "validateAndRunExoquery"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if a user divided by zero):\n{\"outputErr\": \"Exception in thread \"main\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
                "inputSchema": {
                  "properties": {
                    "code": {
                      "description": "\nComplete ExoQuery Kotlin code to compile.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes matching your query entities\n3. The query expression using sql { ... } or sql.select { ... }\n4. A main() function ending with .buildFor.<Dialect>().runSample() or .buildPrettyFor.<Dialect>().runSample()\n   This function MUST be present to trigger SQL generation.\n\nThe runSample() function triggers SQL generation but does NOT execute the query for validateExoquery.\n(Note that this is NOT for production ExoQuery usage. For that you use `.runOn(database)`.)\n\nDialect is part of the code (e.g., .buildFor.Postgres()), NOT a separate parameter.\n\nIf compilation fails, check the error interval positions to locate the exact issue in your code.\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "code"
                  ],
                  "type": "object"
                },
                "name": "validateExoquery",
                "title": "validateExoquery"
              }
            ]
          }
        },
        "url": "https://backend.exoquery.com/mcp"
      },
      "latency_ms": 40.5,
      "status": "ok"
    },
    "transport_compliance_probe": {
      "details": {
        "bad_protocol_error": null,
        "bad_protocol_headers": {
          "content-type": "application/json",
          "strict-transport-security": "max-age=31536000 ; includeSubDomains"
        },
        "bad_protocol_payload": {
          "id": 410,
          "jsonrpc": "2.0",
          "result": {
            "tools": [
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nAccess comprehensive ExoQuery documentation organized by topic and category.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nAVAILABLE DOCUMENTATION CATEGORIES:\n\n1. **Getting Started**\n   - Introduction: What ExoQuery is and why it exists\n   - Installation: Project setup and dependencies\n   - Quick Start: First query in minutes\n\n2. **Core Concepts**\n   - SQL Blocks: The sql { } construct and query building\n   - Parameters: Safe runtime data handling\n   - Composing Queries: Functional query composition\n\n3. **Query Operations**\n   - Basic Operations: Map, filter, and transformations\n   - Joins: Inner, left, and implicit joins\n   - Grouping: GROUP BY and HAVING clauses\n   - Sorting: ORDER BY operations\n   - Subqueries: Correlated and nested queries\n   - Window Functions: Advanced analytics\n\n4. **Actions**\n   - Insert: INSERT with returning and conflict handling\n   - Update: UPDATE operations with setParams\n   - Delete: DELETE with returning\n   - Batch Operations: Bulk inserts and updates\n\n5. **Advanced Features**\n   - SQL Fragment Functions: Reusable SQL components with @SqlFragment\n   - Dynamic Queries: Runtime query generation with @SqlDynamic\n   - Free Blocks: Custom SQL and user-defined functions\n   - Transactions: Transaction support patterns\n   - Polymorphic Queries: Interfaces, sealed classes, higher-order functions\n   - Local Variables: Variables within SQL blocks\n\n6. 
**Data Handling**\n   - Serialization: kotlinx.serialization integration\n   - Custom Type Encoding: Custom encoders and decoders\n   - JSON Columns: JSON and JSONB support (PostgreSQL)\n   - Column Naming: @SerialName and @ExoEntity annotations\n   - Nested Datatypes: Complex data structures\n   - Kotlinx Integration: JSON and other serialization formats\n\n7. **Schema-First Development**\n   - Entity Generation: Compile-time code generation from database schema\n   - AI-Enhanced Entities: Using LLMs to generate cleaner entity code\n\n8. **Reference**\n   - SQL Functions: Available string, math, and date functions\n   - API Reference: Core types and function signatures\n\nHOW TO USE THIS RESOURCE:\n\nThe resource URI follows the pattern:\n  exoquery://docs/{file-path}\n\nWhere {file-path} is the relative path from the docs root, e.g.:\n  - exoquery://docs/01-getting-started/01-introduction.md\n  - exoquery://docs/03-query-operations/02-joins.md\n  - exoquery://docs/05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about ExoQuery syntax, features, or capabilities\n- User needs examples of specific query patterns\n- User encounters errors and needs to verify correct usage\n- User wants to understand advanced features or best practices\n",
                "inputSchema": {
                  "properties": {
                    "filePath": {
                      "description": "\nThe documentation file path to retrieve.\n\nFormat: Relative path from docs root (e.g., \"01-getting-started/01-introduction.md\")\n\nThe full URI is: exoquery://docs/{file-path}\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "filePath"
                  ],
                  "type": "object"
                },
                "name": "getExoQueryDocs",
                "title": "getExoQueryDocs"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nAccess multiple ExoQuery documentation sections simultaneously.\n\nThis tool is similar to the single-document retrieval tool but allows fetching multiple documentation files in a single request. This is particularly useful when you need to gather information from several related topics at once.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nHOW TO USE THIS RESOURCE:\n\nProvide a list of file paths, where each path is the relative path from the docs root, e.g.:\n  - 01-getting-started/01-introduction.md\n  - 03-query-operations/02-joins.md\n  - 05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach returned document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about multiple ExoQuery topics that require information from different sections\n- User needs to compare or understand relationships between different features\n- User wants to get comprehensive information across multiple categories\n- More efficient than making multiple single-document requests\n",
                "inputSchema": {
                  "properties": {
                    "filePaths": {
                      "description": "\nA list of documentation file paths to retrieve.\n\nFormat: List of relative paths from docs root (e.g., [\"01-getting-started/01-introduction.md\", \"03-query-operations/02-joins.md\"])\n\nEach path follows the pattern used in single-document retrieval: {category-folder}/{file-name}.md\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                      "items": {
                        "type": "string"
                      },
                      "type": "array"
                    }
                  },
                  "required": [
                    "filePaths"
                  ],
                  "type": "object"
                },
                "name": "getExoQueryDocsMulti",
                "title": "getExoQueryDocsMulti"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "Lists all available ExoQuery documentation resources with their metadata",
                "inputSchema": {
                  "properties": {},
                  "required": [],
                  "type": "object"
                },
                "name": "listExoQueryDocs",
                "title": "listExoQueryDocs"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nExecute raw, client-provided SQL queries against an ephemeral database initialized with the provided schema.\nReturns query results in a simple JSON format with column headers and row data as a 2D array.\n\nThe database type (SQLite or Postgres) is specified via the databaseType parameter:\n- SQLITE: In-memory, lightweight, uses standard SQLite syntax\n- POSTGRES: Temporary isolated schema with dedicated user, uses PostgreSQL syntax and features\n\nWHEN TO USE: When you need to run your own hand-written SQL queries to test database behavior or\ncompare the output with ExoQuery results from validateAndRunExoquery. This lets you verify that\nExoQuery-generated SQL produces the same results as your expected SQL.\n\nINPUT REQUIREMENTS:\n- query: A valid SQL query (SELECT, INSERT, UPDATE, DELETE, etc.)\n- schema: SQL schema with CREATE TABLE and INSERT statements to initialize the test database\n- databaseType: Either \"SQLITE\" or \"POSTGRES\" (defaults to SQLITE if not specified)\n\nOUTPUT FORMAT:\n\nOn success, returns JSON with the SQL query and a 2D array of results:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nOutput format details:\n- First array element contains column headers\n- Subsequent array elements contain row data\n- All values are returned as strings\n\nOn error, returns JSON with error message and the attempted query (if available):\n{\"error\":\"Query execution failed: no such table: USERS\",\"sql\":\"SELECT * FROM USERS\"}\n\nOr if schema initialization fails:\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users ...\\\\n--------\",\"sql\":\"CREAT TABLE users ...\"}\n\nEXAMPLE INPUT:\n\nQuery:\nSELECT * FROM users ORDER BY id\n\nSchema:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  
name TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nEXAMPLE ERROR OUTPUT (bad table name):\n{\"error\":\"Query execution failed: no such table: invalid_table\",\"sql\":\"SELECT * FROM invalid_table\"}\n\nEXAMPLE ERROR OUTPUT (bad schema):\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users (id INTEGER)\\\\n--------\\\\nCheck that the initialization SQL is valid and compatible with SQLite.\",\"sql\":\"CREAT TABLE users (id INTEGER)\"}\n\nCOMMON QUERY EXAMPLES:\n\nSelect all rows:\nSELECT * FROM users\n\nSelect specific columns with filtering:\nSELECT name, age FROM users WHERE age > 25\n\nAggregate functions:\nSELECT COUNT(*) as total FROM users\n\nJoin queries:\nSELECT u.name, o.total FROM users u JOIN orders o ON u.id = o.user_id\n\nInsert data:\nINSERT INTO users (name, age) VALUES ('David', 40)\n\nUpdate data:\nUPDATE users SET age = 31 WHERE name = 'Alice'\n\nDelete data:\nDELETE FROM users WHERE age < 25\n\nCount with grouping:\nSELECT age, COUNT(*) as count FROM users GROUP BY age\n\nSCHEMA RULES:\n- Use standard SQLite syntax\n- Table names are case-sensitive (use lowercase for simplicity or quote names)\n- Include INSERT statements to populate test data for meaningful results\n- Supported data types: INTEGER, TEXT, REAL, BLOB, NULL\n- Use INTEGER PRIMARY KEY for auto-increment columns\n- Schema SQL is split on semicolons (;), so each statement after a ';' is executed separately\n- Avoid semicolons in comments as they will cause statement parsing issues\n\nCOMPARISON WITH EXOQUERY:\nThis tool 
is designed to work alongside validateAndRunExoquery for comparison purposes:\n1. Use validateAndRunExoquery to run ExoQuery Kotlin code and see the generated SQL + results\n2. Use runRawSql with your own hand-written SQL to verify you get the same output\n3. Compare the outputs to ensure ExoQuery generates the SQL you expect\n4. Test edge cases with plain SQL before writing equivalent ExoQuery code\n",
                "inputSchema": {
                  "properties": {
                    "query": {
                      "description": "\nA valid SQL query to execute against the database.\n\nCan be any valid SQL statement (syntax depends on databaseType parameter):\n- SELECT queries (with WHERE, JOIN, GROUP BY, ORDER BY, LIMIT, etc.)\n- INSERT statements\n- UPDATE statements\n- DELETE statements\n- DDL statements like CREATE/ALTER/DROP (applied after schema initialization)\n\nThe query will be executed against a database initialized with the provided schema parameter.\n\nExample:\nSELECT * FROM users WHERE age > 25 ORDER BY name\n",
                      "type": "string"
                    },
                    "schema": {
                      "description": "\nSQL schema to initialize the ephemeral test database.\n\nMust include:\n1. CREATE TABLE statements for all tables used in the query\n2. INSERT statements with test data\n\nUse syntax appropriate for the selected databaseType (SQLite or Postgres).\nTable names are case-sensitive. The schema is split on semicolons, so each statement is executed separately.\n\nExample:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  name TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "query",
                    "schema"
                  ],
                  "type": "object"
                },
                "name": "runRawSql",
                "title": "runRawSql"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if a user divided by zero):\n{\"outputErr\": \"Exception in thread \"main\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
                "inputSchema": {
                  "properties": {
                    "code": {
                      "description": "\nComplete ExoQuery Kotlin code to compile and execute.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes that EXACTLY match your schema tables\n3. The query expression\n4. A main() function ending with .buildFor.<Dialect>().runSample()\n    This function MUST be present to trigger SQL generation and execution.\n\nUse @SerialName(\"column_name\") when Kotlin property names differ from SQL column names.\nUse @Contextual for BigDecimal fields.\nUse @SqlJsonValue on data classes that represent JSON column values.\n\nMultiple queries in main() will produce multiple output JSON objects.\n",
                      "type": "string"
                    },
                    "databaseType": {
                      "description": "Database type: SQLITE or POSTGRES (default: SQLITE)",
                      "type": "string"
                    },
                    "schema": {
                      "description": "\nSQL schema to initialize the Sqlite test database.\n\nMust include:\n1. CREATE TABLE statements for all tables referenced in the query\n2. INSERT statements with test data to verify query behavior\n\nTable and column names must exactly match the data classes in the code.\nUse double quotes around table names to preserve case: CREATE TABLE \"User\" (...)\n\nCommon error: Table \"USER\" not found, means you wrote CREATE TABLE User but queried \"User\".\nAlways quote table names in schema to match ExoQuery's generated SQL.\n\nExample:\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nINSERT INTO \"User\" VALUES (1, 'Alice', 30), (2, 'Bob', 25);\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "code",
                    "schema"
                  ],
                  "type": "object"
                },
                "name": "validateAndRunExoquery",
                "title": "validateAndRunExoquery"
              },
              {
                "annotations": {
                  "destructiveHint": true,
                  "idempotentHint": false,
                  "openWorldHint": true,
                  "readOnlyHint": false,
                  "title": ""
                },
                "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if the code divided by zero):\n{\"outputErr\": \"Exception in thread \\\"main\\\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
                "inputSchema": {
                  "properties": {
                    "code": {
                      "description": "\nComplete ExoQuery Kotlin code to compile.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes matching your query entities\n3. The query expression using sql { ... } or sql.select { ... }\n4. A main() function ending with .buildFor.<Dialect>().runSample() or .buildPrettyFor.<Dialect>().runSample()\n   This function MUST be present to trigger SQL generation.\n\nThe runSample() function triggers SQL generation but does NOT execute the query for validateExoquery.\n(Note that this is NOT for production ExoQuery usage. For that you use `.runOn(database)`.)\n\nDialect is part of the code (e.g., .buildFor.Postgres()), NOT a separate parameter.\n\nIf compilation fails, check the error interval positions to locate the exact issue in your code.\n",
                      "type": "string"
                    }
                  },
                  "required": [
                    "code"
                  ],
                  "type": "object"
                },
                "name": "validateExoquery",
                "title": "validateExoquery"
              }
            ]
          }
        },
        "bad_protocol_status_code": 200,
        "delete_error": null,
        "delete_status_code": null,
        "expired_session_error": null,
        "expired_session_status_code": null,
        "issues": [
          "missing_session_id",
          "missing_protocol_header",
          "bad_protocol_not_rejected"
        ],
        "last_event_id_visible": false,
        "protocol_header_present": false,
        "requested_protocol_version": "2025-03-26",
        "session_id_present": false,
        "transport": "streamable-http"
      },
      "latency_ms": 43.37,
      "status": "error"
    },
    "utility_coverage_probe": {
      "details": {
        "completions": {
          "advertised": true,
          "live_probe": "not_executed",
          "sample_target": null
        },
        "initialize_capability_keys": [
          "completions",
          "prompts",
          "resources",
          "tools"
        ],
        "pagination": {
          "metadata_signal": false,
          "next_cursor_methods": [],
          "supported": false
        },
        "tasks": {
          "advertised": false,
          "http_status": 500,
          "probe_status": "missing"
        }
      },
      "latency_ms": 162.49,
      "status": "warning"
    }
  },
  "failures": {
    "openid_configuration": {
      "error": "Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/openid-configuration'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403",
      "url": "https://backend.exoquery.com/.well-known/openid-configuration"
    },
    "probe_noise_resilience": {
      "headers": {
        "content-type": "application/json"
      },
      "http_status": 403,
      "url": "https://backend.exoquery.com/robots.txt"
    },
    "server_card": {
      "error": "Client error '403 Forbidden' for url 'https://backend.exoquery.com/.well-known/mcp/server-card.json'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403",
      "url": "https://backend.exoquery.com/.well-known/mcp/server-card.json"
    },
    "transport_compliance_probe": {
      "bad_protocol_error": null,
      "bad_protocol_headers": {
        "content-type": "application/json",
        "strict-transport-security": "max-age=31536000 ; includeSubDomains"
      },
      "bad_protocol_payload": {
        "id": 410,
        "jsonrpc": "2.0",
        "result": {
          "tools": [
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "\nAccess comprehensive ExoQuery documentation organized by topic and category.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nAVAILABLE DOCUMENTATION CATEGORIES:\n\n1. **Getting Started**\n   - Introduction: What ExoQuery is and why it exists\n   - Installation: Project setup and dependencies\n   - Quick Start: First query in minutes\n\n2. **Core Concepts**\n   - SQL Blocks: The sql { } construct and query building\n   - Parameters: Safe runtime data handling\n   - Composing Queries: Functional query composition\n\n3. **Query Operations**\n   - Basic Operations: Map, filter, and transformations\n   - Joins: Inner, left, and implicit joins\n   - Grouping: GROUP BY and HAVING clauses\n   - Sorting: ORDER BY operations\n   - Subqueries: Correlated and nested queries\n   - Window Functions: Advanced analytics\n\n4. **Actions**\n   - Insert: INSERT with returning and conflict handling\n   - Update: UPDATE operations with setParams\n   - Delete: DELETE with returning\n   - Batch Operations: Bulk inserts and updates\n\n5. **Advanced Features**\n   - SQL Fragment Functions: Reusable SQL components with @SqlFragment\n   - Dynamic Queries: Runtime query generation with @SqlDynamic\n   - Free Blocks: Custom SQL and user-defined functions\n   - Transactions: Transaction support patterns\n   - Polymorphic Queries: Interfaces, sealed classes, higher-order functions\n   - Local Variables: Variables within SQL blocks\n\n6. 
**Data Handling**\n   - Serialization: kotlinx.serialization integration\n   - Custom Type Encoding: Custom encoders and decoders\n   - JSON Columns: JSON and JSONB support (PostgreSQL)\n   - Column Naming: @SerialName and @ExoEntity annotations\n   - Nested Datatypes: Complex data structures\n   - Kotlinx Integration: JSON and other serialization formats\n\n7. **Schema-First Development**\n   - Entity Generation: Compile-time code generation from database schema\n   - AI-Enhanced Entities: Using LLMs to generate cleaner entity code\n\n8. **Reference**\n   - SQL Functions: Available string, math, and date functions\n   - API Reference: Core types and function signatures\n\nHOW TO USE THIS RESOURCE:\n\nThe resource URI follows the pattern:\n  exoquery://docs/{file-path}\n\nWhere {file-path} is the relative path from the docs root, e.g.:\n  - exoquery://docs/01-getting-started/01-introduction.md\n  - exoquery://docs/03-query-operations/02-joins.md\n  - exoquery://docs/05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about ExoQuery syntax, features, or capabilities\n- User needs examples of specific query patterns\n- User encounters errors and needs to verify correct usage\n- User wants to understand advanced features or best practices\n",
              "inputSchema": {
                "properties": {
                  "filePath": {
                    "description": "\nThe documentation file path to retrieve.\n\nFormat: Relative path from docs root (e.g., \"01-getting-started/01-introduction.md\")\n\nThe full URI is: exoquery://docs/{file-path}\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                    "type": "string"
                  }
                },
                "required": [
                  "filePath"
                ],
                "type": "object"
              },
              "name": "getExoQueryDocs",
              "title": "getExoQueryDocs"
            },
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "\nAccess multiple ExoQuery documentation sections simultaneously.\n\nThis tool is similar to the single-document retrieval tool but allows fetching multiple documentation files in a single request. This is particularly useful when you need to gather information from several related topics at once.\n\nExoQuery is a Language Integrated Query library for Kotlin Multiplatform that translates Kotlin DSL expressions into SQL at compile time. This resource provides access to the complete documentation covering all aspects of the library.\n\nHOW TO USE THIS RESOURCE:\n\nProvide a list of file paths, where each path is the relative path from the docs root, e.g.:\n  - 01-getting-started/01-introduction.md\n  - 03-query-operations/02-joins.md\n  - 05-advanced-features/01-sql-fragments.md\n\nTo discover available documents, use the MCP resources/list endpoint which will return all available documentation files with their titles, descriptions, and categories.\n\nEach returned document includes:\n- Title and description\n- Category classification\n- Complete markdown content with code examples\n- Cross-references to related topics\n\nWHEN TO USE:\n- User asks about multiple ExoQuery topics that require information from different sections\n- User needs to compare or understand relationships between different features\n- User wants to get comprehensive information across multiple categories\n- More efficient than making multiple single-document requests\n",
              "inputSchema": {
                "properties": {
                  "filePaths": {
                    "description": "\nA list of documentation file paths to retrieve.\n\nFormat: List of relative paths from docs root (e.g., [\"01-getting-started/01-introduction.md\", \"03-query-operations/02-joins.md\"])\n\nEach path follows the pattern used in single-document retrieval: {category-folder}/{file-name}.md\n\nTo find available file paths, use the MCP resources/list endpoint which returns metadata for all documentation files including their paths, titles, categories, and descriptions.\n\nCommon paths:\n- Getting Started: 01-getting-started/01-introduction.md, 01-getting-started/02-installation.md, 01-getting-started/03-quick-start.md\n- Core Concepts: 02-core-concepts/01-sql-blocks.md, 02-core-concepts/02-parameters.md, 02-core-concepts/03-composing-queries.md\n- Query Operations: 03-query-operations/01-basic-operations.md, 03-query-operations/02-joins.md, 03-query-operations/03-grouping.md\n- Actions: 04-actions/01-insert.md, 04-actions/02-update.md, 04-actions/03-delete.md\n- Advanced: 05-advanced-features/01-sql-fragments.md, 05-advanced-features/02-dynamic-queries.md\n- Data Handling: 06-data-handling/03-json-columns.md, 06-data-handling/04-column-naming.md\n",
                    "items": {
                      "type": "string"
                    },
                    "type": "array"
                  }
                },
                "required": [
                  "filePaths"
                ],
                "type": "object"
              },
              "name": "getExoQueryDocsMulti",
              "title": "getExoQueryDocsMulti"
            },
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "Lists all available ExoQuery documentation resources with their metadata",
              "inputSchema": {
                "properties": {},
                "required": [],
                "type": "object"
              },
              "name": "listExoQueryDocs",
              "title": "listExoQueryDocs"
            },
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "\nExecute raw, client-provided SQL queries against an ephemeral database initialized with the provided schema.\nReturns query results in a simple JSON format with column headers and row data as a 2D array.\n\nThe database type (SQLite or Postgres) is specified via the databaseType parameter:\n- SQLITE: In-memory, lightweight, uses standard SQLite syntax\n- POSTGRES: Temporary isolated schema with dedicated user, uses PostgreSQL syntax and features\n\nWHEN TO USE: When you need to run your own hand-written SQL queries to test database behavior or\ncompare the output with ExoQuery results from validateAndRunExoquery. This lets you verify that\nExoQuery-generated SQL produces the same results as your expected SQL.\n\nINPUT REQUIREMENTS:\n- query: A valid SQL query (SELECT, INSERT, UPDATE, DELETE, etc.)\n- schema: SQL schema with CREATE TABLE and INSERT statements to initialize the test database\n- databaseType: Either \"SQLITE\" or \"POSTGRES\" (defaults to SQLITE if not specified)\n\nOUTPUT FORMAT:\n\nOn success, returns JSON with the SQL query and a 2D array of results:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nOutput format details:\n- First array element contains column headers\n- Subsequent array elements contain row data\n- All values are returned as strings\n\nOn error, returns JSON with error message and the attempted query (if available):\n{\"error\":\"Query execution failed: no such table: USERS\",\"sql\":\"SELECT * FROM USERS\"}\n\nOr if schema initialization fails:\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users ...\\\\n--------\",\"sql\":\"CREAT TABLE users ...\"}\n\nEXAMPLE INPUT:\n\nQuery:\nSELECT * FROM users ORDER BY id\n\nSchema:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  name 
TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\":\"SELECT * FROM users ORDER BY id\",\"output\":[[\"id\",\"name\",\"age\"],[\"1\",\"Alice\",\"30\"],[\"2\",\"Bob\",\"25\"],[\"3\",\"Charlie\",\"35\"]]}\n\nEXAMPLE ERROR OUTPUT (bad table name):\n{\"error\":\"Query execution failed: no such table: invalid_table\",\"sql\":\"SELECT * FROM invalid_table\"}\n\nEXAMPLE ERROR OUTPUT (bad schema):\n{\"error\":\"Database initialization failed due to: near \\\"CREAT\\\": syntax error\\\\nWhen executing the following statement:\\\\n--------\\\\nCREAT TABLE users (id INTEGER)\\\\n--------\\\\nCheck that the initialization SQL is valid and compatible with SQLite.\",\"sql\":\"CREAT TABLE users (id INTEGER)\"}\n\nCOMMON QUERY EXAMPLES:\n\nSelect all rows:\nSELECT * FROM users\n\nSelect specific columns with filtering:\nSELECT name, age FROM users WHERE age > 25\n\nAggregate functions:\nSELECT COUNT(*) as total FROM users\n\nJoin queries:\nSELECT u.name, o.total FROM users u JOIN orders o ON u.id = o.user_id\n\nInsert data:\nINSERT INTO users (name, age) VALUES ('David', 40)\n\nUpdate data:\nUPDATE users SET age = 31 WHERE name = 'Alice'\n\nDelete data:\nDELETE FROM users WHERE age < 25\n\nCount with grouping:\nSELECT age, COUNT(*) as count FROM users GROUP BY age\n\nSCHEMA RULES:\n- Use standard SQLite syntax\n- Table names are case-sensitive (use lowercase for simplicity or quote names)\n- Include INSERT statements to populate test data for meaningful results\n- Supported data types: INTEGER, TEXT, REAL, BLOB, NULL\n- Use INTEGER PRIMARY KEY for auto-increment columns\n- Schema SQL is split on semicolons (;), so each statement after a ';' is executed separately\n- Avoid semicolons in comments as they will cause statement parsing issues\n\nCOMPARISON WITH EXOQUERY:\nThis tool is 
designed to work alongside validateAndRunExoquery for comparison purposes:\n1. Use validateAndRunExoquery to run ExoQuery Kotlin code and see the generated SQL + results\n2. Use runRawSql with your own hand-written SQL to verify you get the same output\n3. Compare the outputs to ensure ExoQuery generates the SQL you expect\n4. Test edge cases with plain SQL before writing equivalent ExoQuery code\n",
              "inputSchema": {
                "properties": {
                  "query": {
                    "description": "\nA valid SQL query to execute against the database.\n\nCan be any valid SQL statement (syntax depends on databaseType parameter):\n- SELECT queries (with WHERE, JOIN, GROUP BY, ORDER BY, LIMIT, etc.)\n- INSERT statements\n- UPDATE statements\n- DELETE statements\n- DDL statements like CREATE/ALTER/DROP (applied after schema initialization)\n\nThe query will be executed against a database initialized with the provided schema parameter.\n\nExample:\nSELECT * FROM users WHERE age > 25 ORDER BY name\n",
                    "type": "string"
                  },
                  "schema": {
                    "description": "\nSQL schema to initialize the ephemeral test database.\n\nMust include:\n1. CREATE TABLE statements for all tables used in the query\n2. INSERT statements with test data\n\nUse syntax appropriate for the selected databaseType (SQLite or Postgres).\nTable names are case-sensitive. The schema is split on semicolons, so each statement is executed separately.\n\nExample:\nCREATE TABLE users (\n  id INTEGER PRIMARY KEY,\n  name TEXT NOT NULL,\n  age INTEGER\n);\n\nINSERT INTO users (id, name, age) VALUES (1, 'Alice', 30);\nINSERT INTO users (id, name, age) VALUES (2, 'Bob', 25);\nINSERT INTO users (id, name, age) VALUES (3, 'Charlie', 35);\n",
                    "type": "string"
                  }
                },
                "required": [
                  "query",
                  "schema"
                ],
                "type": "object"
              },
              "name": "runRawSql",
              "title": "runRawSql"
            },
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if the code divided by zero):\n{\"outputErr\": \"Exception in thread \\\"main\\\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
              "inputSchema": {
                "properties": {
                  "code": {
                    "description": "\nComplete ExoQuery Kotlin code to compile and execute.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes that EXACTLY match your schema tables\n3. The query expression\n4. A main() function ending with .buildFor.<Dialect>().runSample()\n    This function MUST be present to trigger SQL generation and execution.\n\nUse @SerialName(\"column_name\") when Kotlin property names differ from SQL column names.\nUse @Contextual for BigDecimal fields.\nUse @SqlJsonValue on data classes that represent JSON column values.\n\nMultiple queries in main() will produce multiple output JSON objects.\n",
                    "type": "string"
                  },
                  "databaseType": {
                    "description": "Database type: SQLITE or POSTGRES (default: SQLITE)",
                    "type": "string"
                  },
                  "schema": {
                    "description": "\nSQL schema to initialize the Sqlite test database.\n\nMust include:\n1. CREATE TABLE statements for all tables referenced in the query\n2. INSERT statements with test data to verify query behavior\n\nTable and column names must exactly match the data classes in the code.\nUse double quotes around table names to preserve case: CREATE TABLE \"User\" (...)\n\nCommon error: Table \"USER\" not found, means you wrote CREATE TABLE User but queried \"User\".\nAlways quote table names in schema to match ExoQuery's generated SQL.\n\nExample:\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nINSERT INTO \"User\" VALUES (1, 'Alice', 30), (2, 'Bob', 25);\n",
                    "type": "string"
                  }
                },
                "required": [
                  "code",
                  "schema"
                ],
                "type": "object"
              },
              "name": "validateAndRunExoquery",
              "title": "validateAndRunExoquery"
            },
            {
              "annotations": {
                "destructiveHint": true,
                "idempotentHint": false,
                "openWorldHint": true,
                "readOnlyHint": false,
                "title": ""
              },
              "description": "\nCompile ExoQuery Kotlin code and EXECUTE it against an Sqlite database with provided schema.\nExoQuery is a compile-time SQL query builder that translates Kotlin DSL expressions into SQL.\n\nWHEN TO USE: When you need to verify ExoQuery produces correct results against actual data.\n\nINPUT REQUIREMENTS:\n- Complete Kotlin code (same requirements as validateExoquery)\n- SQL schema with CREATE TABLE and INSERT statements for test data\n- Data classes MUST exactly match the schema table structure\n- Column names in data classes must match schema (use @SerialName for snake_case columns)\n- Must include or or more .runSample() calls in main() to trigger SQL generation and execution\n  (note that .runSample() is NOT or real production use, use .runOn(database) instead)\n  \n\nOUTPUT FORMAT:\n\nReturns one or more JSON objects, each on its own line. Each object can be:\n\n1. SQL with output (query executed successfully):\n   {\"sql\": \"SELECT u.name FROM \\\"User\\\" u\", \"output\": \"[(name=Alice), (name=Bob)]\"}\n\n2. Output only (e.g., print statements, intermediate results):\n   {\"output\": \"Before: [(id=1, title=Ion Blend Beans)]\"}\n\n3. 
Error output (runtime errors, exceptions):\n   {\"outputErr\": \"java.sql.SQLException: Table \\\"USERS\\\" not found\"}\n\nMultiple results appear when code has multiple queries or print statements:\n\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25)]\"}\n{\"output\": \"Before:\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"Rows affected: 1\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans, unit_price=32.00, in_stock=25), (id=2, title=Luna Fuel Flask, unit_price=89.50, in_stock=6)]\"}\n\nCompilation errors return the same format as validateExoquery:\n{\n  \"errors\": {\n    \"File.kt\": [\n      {\n        \"interval\": {\"start\": {\"line\": 12, \"ch\": 10}, \"end\": {\"line\": 12, \"ch\": 15}},\n        \"message\": \"Type mismatch: inferred type is String but Int was expected\",\n        \"severity\": \"ERROR\",\n        \"className\": \"ERROR\"\n      }\n    ]\n  }\n}\n\nRuntime Errors can have the following format:\n{\n  \"errors\" : {\n    \"File.kt\" : [ ]\n  },\n  \"exception\" : {\n    \"message\" : \"[SQLITE_ERROR] SQL error or missing database (no such table: User)\",\n    \"fullName\" : \"org.sqlite.SQLiteException\",\n    \"stackTrace\" : [ {\n      \"className\" : \"org.sqlite.core.DB\",\n      \"methodName\" : \"newSQLException\",\n      \"fileName\" : \"DB.java\",\n      \"lineNumber\" : 1179\n    }, ...]\n  },\n  \"text\" : \"<outStream><outputObject>\\n{\\\"sql\\\": \\\"SELECT x.id, x.name, x.age FROM User x\\\"}\\n</outputObject>\\n</outStream>\"\n}\nIf there was a SQL query generated before the error, it will appear in the \"text\" field output stream.\n\n\nEXAMPLE INPUT CODE:\n```kotlin\nimport io.exoquery.*\nimport kotlinx.serialization.Serializable\nimport kotlinx.serialization.SerialName\n\n@Serializable\ndata class User(val 
id: Int, val name: String, val age: Int)\n\n@Serializable\ndata class Order(val id: Int, @SerialName(\"user_id\") val userId: Int, val total: Int)\n\nval userOrders = sql.select {\n    val u = from(Table<User>())\n    val o = join(Table<Order>()) { o -> o.userId == u.id }\n    Triple(u.name, o.total, u.age)\n}\n\nfun main() = userOrders.buildPrettyFor.Sqlite().runSample()\n```\n\nEXAMPLE INPUT SCHEMA:\n```sql\nCREATE TABLE \"User\" (id INT, name VARCHAR(100), age INT);\nCREATE TABLE \"Order\" (id INT, user_id INT, total INT);\n\nINSERT INTO \"User\" (id, name, age) VALUES\n  (1, 'Alice', 30),\n  (2, 'Bob', 25);\n\nINSERT INTO \"Order\" (id, user_id, total) VALUES\n  (1, 1, 100),\n  (2, 1, 200),\n  (3, 2, 150);\n```\n\nEXAMPLE SUCCESS OUTPUT:\n{\"sql\": \"SELECT u.name AS first, o.total AS second, u.age AS third FROM \\\"User\\\" u INNER JOIN \\\"Order\\\" o ON o.user_id = u.id\", \"output\": \"[(first=Alice, second=100, third=30), (first=Alice, second=200, third=30), (first=Bob, second=150, third=25)]\"}\n\nEXAMPLE WITH MULTIPLE OPERATIONS (insert with before/after check):\n{\"output\": \"Before:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans)]\"}\n{\"sql\": \"INSERT INTO \\\"InventoryItem\\\" (title, unit_price, in_stock) VALUES (?, ?, ?)\", \"output\": \"\"}\n{\"output\": \"After:\"}\n{\"sql\": \"SELECT * FROM \\\"InventoryItem\\\"\", \"output\": \"[(id=1, title=Ion Blend Beans), (id=2, title=Luna Fuel Flask)]\"}\n\nEXAMPLE RUNTIME ERROR (if a user divided by zero):\n{\"outputErr\": \"Exception in thread \\\"main\\\" java.lang.ArithmeticException: / by zero\"}\n\nKEY PATTERNS:\n\n(See validateExoquery for complete pattern reference)\n\nSummary of most common patterns:\n- Filter: sql { Table<T>().filter { x -> x.field == value } }\n- Select: sql.select { val x = from(Table<T>()); where { ... 
}; x }\n- Join: sql.select { val a = from(Table<A>()); val b = join(Table<B>()) { b -> b.aId == a.id }; Pair(a, b) }\n- Left join: joinLeft(Table<T>()) { ... } returns nullable\n- Insert: sql { insert<T> { setParams(obj).excluding(id) } }\n- Update: sql { update<T>().set { it.field to value }.where { it.id == x } }\n- Delete: sql { delete<T>().where { it.id == x } }\n\nSCHEMA RULES:\n- Table names should match data class names (case-sensitive, use quotes for exact match)\n- Column names must match @SerialName values or property names\n- Include realistic test data to verify query logic\n- Sqlite database syntax (mostly compatible with standard SQL)\n\nCOMMON PATTERNS:\n- JSON columns: Use VARCHAR for storage, @SqlJsonValue on the nested data class\n- Auto-increment IDs: Use INTEGER PRIMARY KEY\n- Nullable columns: Use Type? in Kotlin, allow NULL in schema\n",
              "inputSchema": {
                "properties": {
                  "code": {
                    "description": "\nComplete ExoQuery Kotlin code to compile.\n\nMust include:\n1. Imports (minimum: io.exoquery.*, kotlinx.serialization.Serializable)\n2. @Serializable data classes matching your query entities\n3. The query expression using sql { ... } or sql.select { ... }\n4. A main() function ending with .buildFor.<Dialect>().runSample() or .buildPrettyFor.<Dialect>().runSample()\n   This function MUST be present to trigger SQL generation.\n\nThe runSample() function triggers SQL generation but does NOT execute the query for validateExoquery.\n(Note that this is NOT for production ExoQuery usage. For that you use `.runOn(database)`.)\n\nDialect is part of the code (e.g., .buildFor.Postgres()), NOT a separate parameter.\n\nIf compilation fails, check the error interval positions to locate the exact issue in your code.\n",
                    "type": "string"
                  }
                },
                "required": [
                  "code"
                ],
                "type": "object"
              },
              "name": "validateExoquery",
              "title": "validateExoquery"
            }
          ]
        }
      },
      "bad_protocol_status_code": 200,
      "delete_error": null,
      "delete_status_code": null,
      "expired_session_error": null,
      "expired_session_status_code": null,
      "issues": [
        "missing_session_id",
        "missing_protocol_header",
        "bad_protocol_not_rejected"
      ],
      "last_event_id_visible": false,
      "protocol_header_present": false,
      "requested_protocol_version": "2025-03-26",
      "session_id_present": false,
      "transport": "streamable-http"
    }
  },
  "remote_url": "https://backend.exoquery.com/mcp",
  "server_card_payload": null,
  "server_identifier": "com.exoquery/mcp-server"
}

Known versions

Validation history

7 day score delta
+0.0
30 day score delta
+0.5
Recent healthy ratio
100%
Freshness
605.0h
Timestamp | Status | Score | Latency | Tools
Apr 09, 2026 12:56:41 AM UTC Healthy 74.8 782.3 ms 6
Apr 08, 2026 12:52:39 AM UTC Healthy 74.8 1228.1 ms 6
Apr 07, 2026 12:48:39 AM UTC Healthy 74.8 754.9 ms 6
Apr 06, 2026 12:45:22 AM UTC Healthy 74.3 804.9 ms 6
Apr 05, 2026 12:43:08 AM UTC Healthy 74.3 1012.3 ms 6
Apr 04, 2026 12:41:08 AM UTC Healthy 74.3 895.1 ms 6
Apr 03, 2026 12:37:33 AM UTC Healthy 74.3 1075.4 ms 6
Apr 02, 2026 12:23:25 AM UTC Healthy 74.3 1145.4 ms 6

Validation timeline

Validated | Summary | Score | Protocol | Auth mode | Tools | High-risk tools | Changes
Apr 09, 2026 12:56:41 AM UTC Healthy 74.8 2025-03-26 oauth_supported 6 5 none
Apr 08, 2026 12:52:39 AM UTC Healthy 74.8 2025-03-26 oauth_supported 6 5 none
Apr 07, 2026 12:48:39 AM UTC Healthy 74.8 2025-03-26 oauth_supported 6 5 none
Apr 06, 2026 12:45:22 AM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Apr 05, 2026 12:43:08 AM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Apr 04, 2026 12:41:08 AM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Apr 03, 2026 12:37:33 AM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Apr 02, 2026 12:23:25 AM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Mar 31, 2026 11:57:31 PM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Mar 30, 2026 11:49:09 PM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Mar 29, 2026 11:25:48 PM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none
Mar 28, 2026 10:06:46 PM UTC Healthy 74.3 2025-03-26 oauth_supported 6 5 none

Recent validation runs

Started | Status | Summary | Latency | Checks
Apr 09, 2026 12:56:40 AM UTC Completed Healthy 782.3 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 08, 2026 12:52:38 AM UTC Completed Healthy 1228.1 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 07, 2026 12:48:38 AM UTC Completed Healthy 754.9 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 06, 2026 12:45:21 AM UTC Completed Healthy 804.9 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 05, 2026 12:43:07 AM UTC Completed Healthy 1012.3 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 04, 2026 12:41:07 AM UTC Completed Healthy 895.1 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 03, 2026 12:37:32 AM UTC Completed Healthy 1075.4 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Apr 02, 2026 12:23:24 AM UTC Completed Healthy 1145.4 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Mar 31, 2026 11:57:30 PM UTC Completed Healthy 1068.3 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe
Mar 30, 2026 11:49:08 PM UTC Completed Healthy 736.8 ms action_safety_probe, advanced_capabilities_probe, connector_publishability_probe, connector_replay_probe, determinism_probe, initialize, interactive_flow_probe, oauth_authorization_server, oauth_protected_resource, official_registry_probe, openid_configuration, probe_noise_resilience, prompt_get, prompts_list, protocol_version_probe, provenance_divergence_probe, request_association_probe, resource_read, resources_list, server_card, session_resume_probe, step_up_auth_probe, tool_snapshot_probe, tools_list, transport_compliance_probe, utility_coverage_probe