Building with AI? Start here.

Connect AI to
Your Data. Securely.

The DreamFactory AI Academy teaches you to connect ChatGPT, Claude, Cursor, or any AI agent to enterprise databases, with RBAC, identity passthrough, and audit logging built in. Self-hosted. Your data never leaves your infrastructure.

v7.4
MCP Server Built-In
8
AI Modules
5 min
Avg Time to API
0
New Infra Required
ChatGPT - sales-database
You: Show me customers by region
(Querying via DreamFactory MCP...)
AI: 134 customers across 5 regions
You: Which have active subscriptions over $100K?
(Filtering with RBAC policies...)
AI: 32 high-value customers found
You: Who renews this quarter?
AI: 8 renewals - $1.2M pipeline
LIVE Chat with your data, set up in ~5 minutes
AI Agent + DreamFactory + Your Database

The AI Academy teaches you to combine these layers into a single governed workflow. No new infrastructure, just new capability on top of DreamFactory.

ChatGPT
Claude
Cursor
Custom
DreamFactory v7.4
AI Data Gateway
RBAC MCP Audit Identity
SQL Server
🐘 PostgreSQL
🐬 MySQL
❄️ Snowflake
🔶 Oracle
🍃 MongoDB
AI Chat Apps
Dashboards
Internal Tools
Micro APIs
Auto Reports
Slack / Teams
Your APIs are already AI-ready.
You just haven't connected them yet.

Data stays put. Governance travels with every request. And you go from question to answer in minutes, not quarters. REST APIs, not SQL. Identity passthrough, not service accounts. Your security infrastructure, not ours.

Every DreamFactory API (every database connection, every stored procedure, every RBAC role) is a building block for AI. The gap isn't infrastructure. It's knowledge.

We built the AI Academy to bridge that gap. Whether you're already running DreamFactory in production or exploring it for the first time, you'll learn how to turn any database into a secure, AI-ready data source in minutes, not months.

With DreamFactory's native MCP Server, your APIs can be consumed by AI tools like ChatGPT, Claude, Cursor, and local LLMs, with full RBAC and identity passthrough intact. The Academy teaches you how.

"DreamFactory has allowed us to remove three or four layers of complexity out of handling the AI." Enterprise AI Implementation Team

🔗 Your APIs, Their AI

AI tools like ChatGPT, Claude, or Cursor can connect directly to your DreamFactory instance via MCP. No new infrastructure, just a config pointing to your existing endpoints.

🔒 Security You Already Have

Identity passthrough means AI queries run as the actual user, not a shared service account. Your RBAC, stored procedure boundaries, and audit logs apply automatically. In one published study, all five tested LLM applications were vulnerable to prompt-to-SQL injection; DreamFactory's parameterized queries prevent this class of attack by design.
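A minimal sketch of what this looks like in practice, using DreamFactory's documented `/_table` filter syntax and request headers. The host name, token, and key values are placeholders; the point is that the filter travels as a URL parameter (parameterized server-side, never raw SQL) and the session token carries the real user's identity:

```python
from urllib.parse import urlencode

BASE_URL = "https://df.example.com"  # assumption: your DreamFactory host

def build_user_scoped_request(service, table, user_token, api_key,
                              filter_expr, limit=50):
    """Build a DreamFactory table query that runs as the real user.

    The filter is a URL parameter, not raw SQL, so the server
    parameterizes it; the session token identifies the actual caller,
    so RBAC and audit logs see the user, not a service account.
    """
    query = urlencode({"filter": filter_expr, "limit": limit})
    url = f"{BASE_URL}/api/v2/{service}/_table/{table}?{query}"
    headers = {
        "X-DreamFactory-Api-Key": api_key,            # identifies the app/role
        "X-DreamFactory-Session-Token": user_token,   # identifies the user
    }
    return url, headers

url, headers = build_user_scoped_request(
    "sales", "customers", user_token="<alice-jwt>", api_key="<key>",
    filter_expr="(region='EMEA') and (arr > 100000)",
)
```

If Alice's role can't see a table or field, the same request simply returns a 403 or a filtered result; nothing in the AI layer can widen her access.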

⚡ Layers Eliminated

One team eliminated $250K+ in vendor engagements and reduced 140MB of legacy code to kilobytes. Building your own LLM data layer costs $1.5M+ and takes 12-18 months. DreamFactory collapses that to days. AI responses return in 4-16 seconds.

📊 API-First, Not RAG

RAG accuracy for structured enterprise data falls below 60%. Deterministic API queries through DreamFactory achieve 97%. No vector databases, no stale embeddings, no data duplication. Just real-time, governed access to your live data.

Ask ChatGPT Questions About
Your Database

DreamFactory CTO Kevin McGahey connects ChatGPT to a live sales database through DreamFactory's MCP Server in under 5 minutes. No code. No SQL. Just plain English questions.

Create MCP Endpoint
Set up an MCP Server in DreamFactory and connect it to your database, no code required
Connect to ChatGPT
Link the MCP endpoint to ChatGPT using OAuth authentication in under a minute
Query Live Data
Ask plain English questions and get real-time, governed results from your database
Eight modules. Real projects.
Built by our engineering team.

Each module maps to a real engineering blog post and a hands-on project. Build AI-powered features on your own DreamFactory instance, or follow along with a free trial.

MODULE 02

Setting Up Your MCP Server

Configure DreamFactory's native MCP Server (v1.1.0) so AI tools like ChatGPT, Claude, or Cursor can discover and query your databases, with OAuth 2.0 + PKCE authentication and full RBAC.

  • Hybrid PHP/Node.js MCP architecture
  • OAuth 2.0 + PKCE authentication flow
  • StreamableHTTP transport & tool discovery
  • Connect Claude, ChatGPT, or Cursor in minutes
Start Module →
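The PKCE half of the Module 02 authentication flow is plain RFC 7636 and can be sketched in a few lines. This is a generic illustration, not DreamFactory-specific code: the MCP client sends the challenge with the authorization request and later proves possession of the verifier, so an intercepted authorization code alone is useless.

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE verifier/challenge pair (RFC 7636, S256 method)."""
    # High-entropy verifier, base64url without padding (43-128 chars)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # Challenge = base64url(SHA-256(verifier)), also without padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
```

DreamFactory generates the OAuth client credentials for you when you create an MCP service; the module walks through where this handshake happens in ChatGPT, Claude, and Cursor.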
MODULE 03

Security Fundamentals: RBAC, Roles & API Keys

Understand DreamFactory's layered security model: Role-Based Access Control, service-level permissions, table and field restrictions, and API key management. This is the foundation that governs every AI interaction.

  • Four-layer security pipeline
  • Role creation & service access
  • Table & field-level restrictions
  • API key management best practices
Start Module →
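A hedged sketch of what a GET-only role looks like as a payload to DreamFactory's `system/role` admin API. The bitmask values and field names follow DreamFactory's documented role format, but verify them against your version's docs; the service ID and table names here are placeholders:

```python
# Verb bits used by DreamFactory role access entries
GET, POST, PUT, PATCH, DELETE = 1, 2, 4, 8, 16

def read_only_role(name, service_id, tables):
    """Payload for POST /api/v2/system/role granting GET-only table access.

    Anything not listed is denied by default, so an AI client bound to
    this role cannot see other tables or use other verbs at all.
    """
    return {
        "name": name,
        "is_active": True,
        "role_service_access_by_role_id": [
            {
                "service_id": service_id,
                "component": f"_table/{table}/*",
                "verb_mask": GET,     # read-only: no POST/PUT/PATCH/DELETE
                "requestor_mask": 1,  # API callers only
            }
            for table in tables
        ],
    }

role = read_only_role("ai_readonly", service_id=5, tables=["customers", "orders"])
```

Bind each AI tool's API key to a role like this and the sensitive tables you never listed are simply invisible to it.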
MODULE 04

Deterministic Queries & Stored Procedures for AI

Learn why letting AI generate raw SQL is dangerous and how stored procedures create security contracts that make AI data access predictable, auditable, and safe.

  • Why ad-hoc AI SQL is risky in production
  • Stored procs as security contracts
  • Four AI-safe data access patterns
  • GDPR, HIPAA & FedRAMP compliance
Start Module →
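The "security contract" idea can be sketched as a thin wrapper over DreamFactory's `_proc` endpoint: the AI never composes SQL, it can only pick from an allow-list of procedures that a human reviewed before deployment. The allow-list and helper below are illustrative, not part of DreamFactory itself:

```python
# Hypothetical allow-list: procedures reviewed and approved for AI use
ALLOWED_PROCS = {"GetMonthlyReport", "GetRecordSummaryData"}

def proc_request(base_url, service, proc_name, **params):
    """Build a call to a pre-approved stored procedure.

    Every query shape the AI can trigger was reviewed in advance;
    parameters are named values, so there is nothing to inject into.
    """
    if proc_name not in ALLOWED_PROCS:
        raise ValueError(f"procedure not approved for AI use: {proc_name}")
    url = f"{base_url}/api/v2/{service}/_proc/{proc_name}"
    payload = {"params": [{"name": k, "value": v} for k, v in params.items()]}
    return url, payload

url, payload = proc_request(
    "https://df.example.com", "MainDatabase", "GetMonthlyReport",
    StartDate="2026-01-01", EndDate="2026-01-31",
)
```

In DreamFactory you get the same effect declaratively by granting the AI role EXECUTE on `_proc` components and nothing on `_table`.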
MODULE 05

Identity Passthrough for AI Agents

When an AI queries data on behalf of a user, who runs the query? Learn to implement identity passthrough so AI respects user permissions and generates meaningful audit trails.

  • Token forwarding & delegation patterns
  • User-scoped AI queries
  • Blast radius containment
  • Azure AD / SAML / OAuth integration
Start Module →
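The core delegation pattern reduces to a small rule: reuse the caller's session token downstream, and refuse to fall back to a shared service account. A hypothetical helper (the header names are DreamFactory's; the function itself is illustrative):

```python
def passthrough_headers(incoming_headers, api_key):
    """Forward the caller's identity to the downstream DreamFactory call.

    The query then runs under the real user's permissions, the audit
    log records who actually asked, and a leaked AI credential is
    contained to one user's blast radius.
    """
    token = incoming_headers.get("X-DreamFactory-Session-Token")
    if not token:
        # Fail closed: never silently widen access with a shared account
        raise PermissionError("no user identity: refusing service-account fallback")
    return {
        "X-DreamFactory-Api-Key": api_key,
        "X-DreamFactory-Session-Token": token,  # query runs as this user
    }

headers = passthrough_headers(
    {"X-DreamFactory-Session-Token": "<alice-jwt>"}, api_key="<key>"
)
```

The module covers where that user token comes from in Azure AD, SAML, and OAuth setups.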
MODULE 06

Building REST APIs for AI Consumption

Understand how REST APIs work under the hood, from Node.js/Express patterns to DreamFactory's zero-code generation, and how to design endpoints that AI agents consume efficiently.

  • REST API architecture fundamentals
  • DreamFactory vs. hand-coded APIs
  • Swagger/OpenAPI for AI discovery
  • Designing AI-friendly response shapes
Start Module →
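One concrete example of an "AI-friendly response shape": DreamFactory wraps table rows in a top-level `resource` array, and trimming that to just the fields a question needs saves context-window tokens and keeps the model focused. A minimal sketch (the field choices are illustrative):

```python
def compact_for_llm(df_response, fields, max_rows=20):
    """Reshape a DreamFactory table response for an LLM prompt.

    Keeps only the requested fields and caps the row count, so large
    result sets do not blow out the model's context window.
    """
    rows = df_response.get("resource", [])[:max_rows]
    return [{f: row.get(f) for f in fields} for row in rows]

sample = {"resource": [
    {"id": 1, "name": "Acme", "region": "EMEA", "internal_notes": "..."},
    {"id": 2, "name": "Globex", "region": "APAC", "internal_notes": "..."},
]}
slim = compact_for_llm(sample, fields=["name", "region"])
# → [{'name': 'Acme', 'region': 'EMEA'}, {'name': 'Globex', 'region': 'APAC'}]
```

Note that field-level trimming in a client is a token optimization, not a security boundary; sensitive columns should be masked by RBAC on the server, as covered in Module 03.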
MODULE 07

API Documentation That AI Can Read

Great API docs aren't just for humans anymore. Learn to generate documentation that both developers and AI agents can parse, discover, and act on reliably.

  • Auto-generated Swagger from DreamFactory
  • OpenAPI specs as AI context
  • MCP tool discovery patterns
  • Documentation best practices
Start Module →
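To make the "docs as AI context" idea concrete: MCP-style discovery boils down to turning each OpenAPI path and method into something a model can choose to call. A generic sketch, not DreamFactory internals:

```python
def tools_from_openapi(spec):
    """Turn an OpenAPI spec dict into a compact tool list for an AI agent.

    Each path+method with its summary becomes a candidate action; good
    operationIds and summaries are what make the tools discoverable.
    """
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "method": method.upper(),
                "path": path,
            })
    return tools

spec = {"paths": {"/_table/customers": {"get": {
    "operationId": "getCustomers", "summary": "List customer records"}}}}
tools = tools_from_openapi(spec)
# → [{'name': 'getCustomers', 'description': 'List customer records',
#     'method': 'GET', 'path': '/_table/customers'}]
```

DreamFactory auto-generates these specs for every service, which is why vague summaries are the usual weak link, not missing docs.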
MODULE 08 | CAPSTONE

Connect Local AI to Enterprise Databases

Walk through a real production implementation: connect a self-hosted LLM to SQL Server through DreamFactory's AI Data Gateway to generate AI-powered data summaries for any use case.

  • Python scripted service orchestration
  • Stored procedure → AI prompt pipeline
  • Timeout handling & response parsing
  • On-premise LLM deployment patterns
Start Module →
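The timeout-handling bullet deserves a sketch, because it is the most common production pitfall: a scripted service's outbound LLM call without an explicit timeout can pin a PHP-FPM worker indefinitely and starve the worker pool. A minimal illustration using the standard library (the endpoint URL and model name are assumptions):

```python
import json
import urllib.request

LLM_TIMEOUT = 300  # seconds: LLM inference can take minutes, not the usual 30s

def build_llm_request(url, payload, headers):
    """Build the outbound LLM call made from a scripted service (sketch)."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={**headers, "Content-Type": "application/json"},
        method="POST",
    )

def call_llm(req):
    # The explicit timeout is the point: without it, a slow or hung model
    # endpoint can deadlock the PHP-FPM worker pool hosting the script.
    with urllib.request.urlopen(req, timeout=LLM_TIMEOUT) as resp:
        return json.loads(resp.read())

req = build_llm_request(
    "http://localhost:8000/v1/chat/completions",  # assumption: local LLM endpoint
    {"model": "local-model", "messages": [{"role": "user", "content": "Summarize"}]},
    {"X-DreamFactory-Api-Key": "<key>"},
)
```

Pair the long client timeout with a matching server-side limit so a stuck inference fails loudly instead of silently holding a worker.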
70 Ready-to-Use Prompts for
DreamFactory + AI

Copy-paste prompts optimized for each DreamFactory capability. Use with Claude, ChatGPT, Cursor, or any MCP-connected AI assistant.

MCP Tool Fundamentals
1
"Use list_apis to show me every database and file service connected to this DreamFactory instance"
MCP discovery: always start here to see what's available
2
"Call {db}_get_data_model to get the full schema (tables, columns, foreign keys, and row counts), then explain the data relationships"
Condensed schema discovery: the recommended starting point for any database
3
"Use all_find_table to search for a table named [keyword] across every connected database"
Cross-service table search: finds data across all connected databases at once
4
"Use {db}_get_table_data to show me all [table] records where [field] = '[value]', sorted by [column] descending, limit 50"
Filtered queries with sorting and pagination (1,000 record hard cap per request)
5
"Use {db}_aggregate_data to calculate SUM, COUNT, and AVG of [metric] grouped by [dimension] for the last [period]"
Built-in aggregation tool: SUM, COUNT, AVG, MIN, MAX with GROUP BY
6
"Call {db}_call_stored_procedure to execute [procedure_name] with parameters [param1]=[value1], [param2]=[value2]"
Execute stored procedures: deterministic, auditable, immune to SQL injection
7
"Use {fs}_list_files to browse the [path] directory, then {fs}_get_file to read the contents of [filename]"
File storage MCP tools: list and read from S3, Azure Blob, SFTP, WebDAV
8
"Use all_get_tables to list every table across all connected databases, then find where [entity] data lives"
Cross-database table inventory: see everything in one call
9
"Use {db}_get_stored_procedures to list available procs, then call the one that handles [business_task]"
Discover and execute stored procedures, the safest way for AI to write data
10
"First list_apis, then {db}_get_data_model for each service, then summarize what data this DreamFactory instance manages"
Full instance audit: multi-step discovery flow across all services
Business Questions in Plain English
1
"Show me all customers who placed orders over $1,000 in the last 30 days. Include contact info, order totals, and last order date"
AI uses {db}_get_table_data with filters and related table lookups
2
"Show me inactive customers with the highest lifetime value. Who should we re-engage first?"
The CEO demo scenario: AI queries, filters, and ranks business insights
3
"Generate a revenue summary grouped by region and product category for Q4, using {db}_aggregate_data with SUM and COUNT"
Aggregation queries: GROUP BY with SUM, COUNT, AVG, MIN, MAX
4
"Find all records in [table] where [field] contains anomalies or values outside 2 standard deviations from the mean"
Data quality checks: AI uses aggregate_data for stats, then get_table_data for outliers
5
"Execute the GetMonthlyReport stored procedure with @StartDate and @EndDate parameters and summarize the results"
Stored procedure execution: deterministic queries reviewed and approved before deployment
6
"Use {db}_create_records to insert these new entries into [table] and confirm what was created"
Record creation via MCP: respects RBAC, field restrictions, and server-side filters
7
"Compare [table] data between the MySQL service and PostgreSQL service. Find records that exist in one but not the other"
Cross-database comparison: AI queries multiple services via MCP and diffs results
8
"Set up virtual foreign keys between [table1] and [table2] so AI can follow relationships across tables without native FK support"
Virtual relationships: enables related lookups on NoSQL and legacy databases
9
"Show me the {db}_get_table_schema for [table]. I need column types, constraints, defaults, and nullable fields"
Deep schema inspection: understand field-level details before writing queries
10
"Get the OpenAPI spec for [service] using {db}_get_api_spec and explain the available query parameters and filter syntax"
API spec retrieval: shows AI the exact query syntax for DreamFactory REST endpoints
File Storage Operations
1
"Use {fs}_list_files to show me everything in the [folder_path] directory on my [S3/Azure Blob/SFTP] storage"
File discovery: browse cloud and on-premise storage via MCP
2
"Use {fs}_get_file to read the CSV at [path], then parse the data and summarize it for me"
File read with auto-detection: text returned directly, binary as base64
3
"Read the CSV from S3 using {fs}_get_file, then insert each row into [database_table] using {db}_create_records"
File-to-database ETL pipeline: AI reads files and writes to databases in one flow
4
"Use {fs}_create_file to save this report as a JSON file at [path] on [storage_service]"
File creation: write query results or reports back to cloud storage
5
"Browse the /reports/ directory with {fs}_list_files, read the latest file, and compare it with last month's report"
File browsing and analysis: AI navigates storage and compares documents
6
"Read the JSON config at [path] using {fs}_get_file, update the [setting] value, and write it back with {fs}_create_file"
Configuration management: read, modify, and save config files via MCP
7
"List all files on my SFTP server, then read and summarize each text file in the /inbox/ folder"
Batch file processing: iterate through files and extract insights
8
"Query the database for [data], format it as CSV, then use {fs}_create_file to save it to S3 at /exports/[filename].csv"
Database-to-file export: AI queries data and writes results to cloud storage
9
"Use {fs}_delete_file to clean up processed files in /archive/ that are no longer needed"
File cleanup: delete files via MCP with role-based write permissions
10
"Read the image file at [path] using {fs}_get_file and describe what you see in the image"
Multimodal file access: images and audio returned as base64 for AI analysis
Security & Access Control
1
"Create a role that allows GET-only access to [tables] and completely blocks [sensitive_tables] from AI queries"
RBAC setup: service, table, verb, and field-level restrictions enforced on every request
2
"Configure identity passthrough so each AI user's queries run under their own credentials, not a shared service account"
Identity passthrough: AI inherits the real user's permissions and audit identity
3
"Set up server-side row filters so the sales role can only see records where region = their assigned territory"
Row-level security: automatic WHERE clauses the AI cannot bypass or override
4
"Configure the MCP service OAuth 2.0 + PKCE settings. Set access token TTL to 1 hour and refresh tokens to 7 days"
MCP authentication: OAuth credentials auto-generated when you create an MCP service
5
"Create separate API keys for Claude, ChatGPT, and Cursor, each bound to its own role with different access levels"
API key isolation: one key per AI tool, never shared across clients
6
"Mask the SSN and salary columns on the employees table so AI can query employee data but never see sensitive fields"
Column-level restrictions: AI only sees explicitly exposed fields, nothing else
7
"Configure Azure AD SSO with group-to-role mapping so our Azure groups automatically get the right DreamFactory permissions"
Enterprise SSO: Azure AD, Okta, SAML 2.0 with automatic role assignment
8
"Only allow AI write operations through stored procedures. Block direct INSERT, UPDATE, DELETE on all tables for the AI role"
Stored procedure security contracts: AI gets EXECUTE only, no direct table writes
9
"Configure JWT validation using an external JWKS endpoint for [Okta/Auth0/Azure AD] so tokens are verified by DreamFactory"
JWT + JWKS: RS256/ES256 tokens from external IdPs mapped to DreamFactory roles
10
"Set up rate limiting per API key. 100 requests per minute for the AI role, 1000 for the admin role"
Rate limiting: per-key throttling prevents AI from overwhelming your database
Snowflake & Cross-Database Analytics
1
"Use {db}_get_data_model on my Snowflake service to discover all schemas, tables, and column types"
Snowflake schema discovery via MCP: see your entire warehouse structure
2
"Query [table] in Snowflake where [conditions], sorted by [column]. Use {db}_get_table_data with limit 500"
Snowflake queries through DreamFactory: filtered, sorted, paginated via MCP
3
"Use {db}_aggregate_data on my Snowflake service to calculate total revenue, order count, and average order value grouped by month"
Snowflake aggregation: SUM, COUNT, AVG computed server-side, no raw data transfer
4
"Call the Snowflake stored procedure [proc_name] via {db}_call_stored_procedure with parameters for date range and department"
Snowflake stored procedures: secure, pre-approved queries via MCP
5
"Get data from Snowflake [table1] and SQL Server [table2] via their respective MCP services, then compare and join the results"
Cross-database federation: AI queries multiple services and merges results
6
"Analyze my Snowflake data to identify the top 10 customers by lifetime spend, then show their recent order patterns"
Multi-step analytics: AI chains aggregate and detail queries for insights
7
"Run {db}_get_stored_procedures to see what's available in Snowflake, then describe what each procedure does"
Procedure discovery: AI inventories available stored procedures and explains them
8
"Show me the Snowflake data model, then build a report: revenue by product category, customer segment, and quarter"
Schema-driven reporting: discover structure first, then query intelligently
9
"Configure a Snowflake-only role that allows read access to analytics tables but blocks write access and PII columns"
Snowflake RBAC: DreamFactory enforces access control on top of Snowflake native permissions
10
"Create new records in the Snowflake [table] using {db}_create_records with the data I provide"
Snowflake writes via MCP: AI creates records with RBAC enforcement
Scripted Services & Event Scripts
1
"Build a Python3 scripted service that calls a stored procedure, formats the data, sends it to an LLM, and returns an AI summary"
The AI Summary Pipeline, the production pattern from Module 08
2
"Create a pre-process event script that validates [fields] before allowing INSERT into [table]. Reject bad data with clear errors"
Input validation: intercept writes before they reach the database
3
"Write a post-process event script that sends a Slack webhook notification whenever a record is created in [table]"
Event-driven notifications: trigger external alerts on database changes
4
"Create an event script that logs every change to [sensitive_table]. Capture old values, new values, user identity, and timestamp"
Audit logging: change data capture for HIPAA, SOC 2, and GDPR compliance
5
"Write a Python3 script that transforms the API response from [format1] to [format2] before returning it to the client"
Response transformation: reshape data for specific client or AI agent needs
6
"Build a scripted service that enriches [data] by calling [external_api], then merges the enriched data before saving to the database"
Data enrichment pipeline: combine internal data with external API responses
7
"Create a scripted service that queries [database], formats results as a prompt, calls [LLM_endpoint], and returns the AI analysis"
Custom AI orchestration: database, prompt, LLM, response in one API call
8
"Write a script that syncs data between [source_db] and [target_db] with conflict resolution. Use urllib with explicit timeouts"
Data sync: use urllib timeouts to avoid PHP-FPM worker pool deadlock
9
"Build a scripted service that generates a formatted report from database data and saves it to S3 using the file storage API"
Report generation: query data, format output, write to cloud storage
10
"Create a pre-process script that redacts PII fields (SSN, email, phone) from API responses before they reach the AI agent"
PII redaction: script-level data masking for compliance requirements
HTTP Services & External APIs
1
"Create an HTTP service in DreamFactory that proxies [external_api_url]. Add DreamFactory RBAC and rate limiting on top"
API proxy: wrap any external API with DreamFactory security controls
2
"Set up an HTTP service that connects to [SaaS_API] with OAuth2 credentials, so DreamFactory handles token management"
OAuth2 for external APIs: DreamFactory manages token refresh automatically
3
"Create an HTTP service for our internal microservice at [url] and restrict it to the engineering role only"
Internal API gateway: secure microservice access through DreamFactory roles
4
"Configure the HTTP service to add custom headers (X-Correlation-ID, Authorization) to every proxied request"
Header injection: add tracing, auth, and metadata to outbound requests
5
"Use the HTTP service to fetch data from [external_api], then combine it with [database] data from DreamFactory in one response"
Data mashup: combine external API data with internal database data
6
"Set up an HTTP service pointing to our LLM API endpoint so DreamFactory can proxy AI model calls with rate limiting"
LLM proxy: route AI model calls through DreamFactory for cost control
7
"Create HTTP services for [api1], [api2], and [api3] so AI can query all three through DreamFactory with unified security"
Multi-API aggregation: unify multiple external APIs under one security layer
8
"Generate the OpenAPI spec for my HTTP service so AI agents can discover available endpoints and parameters"
API documentation: auto-generated specs help AI understand external services
9
"Set up request/response logging for the [http_service] so we can debug API integration issues and monitor usage"
API debugging: log requests and responses for troubleshooting
10
"Connect DreamFactory to [CRM/ERP/payment API] via HTTP service so AI can query business system data alongside our databases"
Business system integration: give AI access to SaaS data through DreamFactory
Stored Procedure → AI Summary Pipeline

A production pattern used by enterprise teams, achieving 4-16 second response times while eliminating layers of vendor complexity.

architecture - stored-proc-to-ai-pipeline.py
```python
# Step 1: Client calls DreamFactory scripted service
#   GET /api/v2/AISummary?record_id=123&user=alice.sales
#   Header: X-DreamFactory-Api-Key: [key]

# Step 2: Script calls stored procedure (deterministic query)
proc_url = f"{BASE_URL}/api/v2/MainDatabase/_proc/GetRecordSummaryData"
proc_payload = {
    "params": [
        {"name": "RecordId", "value": record_id},
        {"name": "RequestingUser", "value": requesting_user},
    ]
}
# → Returns only approved fields, user-scoped data

# Step 3: Format data and send to AI model (local or cloud)
ai_url = f"{BASE_URL}/api/v2/LocalAI/chat/completions"
ai_payload = {
    "model": AI_MODEL,
    "messages": [{"role": "user", "content": prompt}],
    "temperature": 0.3,
}
# → Identity passthrough: AI knows Alice is asking
# → RBAC enforced: Alice only sees her own records
# → Timeout: 300s for LLM inference (not the default 30s)

# Step 4: Return AI-generated summary as JSON
```
Deterministic stored procedure ✓ · Identity passthrough ✓ · Full audit trail ✓
💡 From the Field
"We discovered our 14 billion parameter model actually outperformed our 120 billion parameter model for summarization tasks. The smaller model focused on language and context without overthinking."

Lesson: Always benchmark your specific use case. Bigger isn't always better, and smaller models mean faster responses and lower infrastructure costs.

Note: This is the scripted-service orchestration pattern (Module 08). Alternatively, AI agents can call stored procedures directly via MCP using {db}_call_stored_procedure (see Modules 01 to 04).

Why RAG Falls Short for Structured Data

Vector databases and embeddings work for unstructured documents, but your structured enterprise data deserves deterministic, real-time access.

RAG for Structured Data
  • Accuracy frequently falls below 60%
  • Stale data from offline indexing cycles
  • Shadow copies in vector databases increase security risk
  • Freeform SQL generation invites injection attacks
  • Re-indexing dynamic data is resource-intensive
API-First via DreamFactory
  • 97% accuracy with deterministic queries
  • Real-time data, no indexing, no staleness
  • Data stays in source systems, zero duplication
  • Parameterized queries immune to injection
  • 70-80% infrastructure cost reduction
97%
Query accuracy
70-80%
Cost reduction vs RAG
90%
Fewer escalations
Real-time
No indexing delay

Source: The API-First Alternative to RAG for Structured Data

Learn from what we're building.

Every Academy module is backed by a deep-dive blog post from the DreamFactory engineering team. Start here.

Video Demo

Ask ChatGPT Questions About YOUR Database

CTO Kevin McGahey connects ChatGPT to a live sales database through DreamFactory's MCP Server in under 5 minutes. No code, no SQL. Just plain English questions and instant answers.

by Kevin McGahey · Jan 5, 2026
Watch video →
AI + Claude

Give Claude Access to Your Database and Start a Conversation with Your Data

Connect Anthropic's Claude to your enterprise data through DreamFactory. Real-time analytics, trend analysis, and root cause exploration, all through natural language conversation.

by DreamFactory · Feb 20, 2026
Read article →
Data Governance

AI Data Gateways & Data Governance: Scaling Trustworthy LLM Agents

How to give AI agents enough access to unlock business value without compromising privacy, compliance, or control. A 7-step repeatable governance workflow for enterprise teams.

by DreamFactory · Feb 13, 2026
Read article →
Build vs Buy

The Hidden Cost of Building Your Own LLM Data Layer

Building your own costs $1.5M+ and takes 12-18 months. One Fortune 500 integrated 50 data sources in 2 weeks. A clear-eyed analysis of build vs. buy for AI data infrastructure.

by DreamFactory · Feb 10, 2026
Read article →
CISO Security

What Your CISO Needs to Know About LLM Database Access

All five tested LLM applications were vulnerable to prompt-to-SQL injection. How API abstraction, parameterized queries, and field-level masking prevent the attacks that keep CISOs up at night.

by Cody Lord · Feb 9, 2026
Read article →
MCP Setup

How to Connect LLM Chat and AI Agents to Enterprise Data Using Built-In MCP

Step-by-step guide to configuring DreamFactory's native MCP Server. Connect ChatGPT, Claude, or any MCP client with OAuth 2.0 authentication. Most teams finish in under 10 minutes.

by Kevin Hood · Feb 6, 2026
Read article →
API vs RAG

The API-First Alternative to RAG for Structured Data

RAG accuracy for structured data falls below 60%. Deterministic API queries achieve 97%. Why the API-first approach eliminates vector databases, reduces costs 70-80%, and cuts escalations by 90%.

by Konnor Kurilla · Feb 5, 2026
Read article →
Enterprise Security

Enterprise Guide: Securing LLM Access to Your Databases

A 5-layer security framework for AI database access: governed REST APIs, identity passthrough with RBAC, deterministic queries, rate limiting, and MCP for local LLMs.

by Terence Bennett · Feb 4, 2026
Read article →
AI + LLM

Connect Your Local AI Model to Enterprise Databases with DreamFactory

A real integration story: connecting a self-hosted AI model to SQL Server via DreamFactory scripted services. Covers timeouts, auth issues, and production patterns.

by Kevin McGahey · Feb 2, 2026
Read article →
Stored Procedures

Why Deterministic Queries and Stored Procedures Are the Future of AI Data Access

Why letting AI generate ad-hoc SQL is dangerous, and how stored procedures create the security contracts AI agents need for safe, auditable enterprise data access.

by Kevin McGahey · Jan 29, 2026
Read article →
Identity + Security

Identity Passthrough for AI: Why Your LLM Needs to Know Who's Asking

When AI queries data, who runs the query? Why shared service accounts break security, and how token forwarding keeps user identity intact through the entire pipeline.

by Nic Davidson · Jan 27, 2026
Read article →
MCP Server

DreamFactory 7.4.0: MCP Server Integration & Azure AD Group Mapping

The release that made DreamFactory natively AI-ready. Native MCP Server support, Azure AD group-to-role mapping, and critical security hardening.

by Kevin McGahey · Jan 27, 2026
Read article →
Official Docs

DreamFactory MCP Server Documentation

Complete reference for MCP server setup, 22 tool patterns, OAuth 2.0 authentication, custom login pages, and FAQ.

by DreamFactory · docs.dreamfactory.com
Read docs →
DreamFactory + NVIDIA Offer

Get a Free NVIDIA DGX Spark
Not a Contest. Not a Raffle.

We're giving away 10 DGX Sparks to help enterprises build local AI connected to their data. New and existing customers are eligible. First 10 to commit get one.

  • 📝
    Commit to a DreamFactory Production License. Whether you're a new or existing customer, sign or upgrade to a production license and qualify for a free DGX Spark
  • 🛠
    40 Hours of Dev Time Included. Our engineers work directly with your team to build local AI solutions
  • ⚡
    Real Results, Fast. One customer automated thousands of hours of manual review in just 4 hours
  • 🔒
    On-Premise AI. Keep your enterprise data secure while running powerful AI models locally

First come, first served. Book a call to discuss your AI project and DGX Spark eligibility, or DM Terence Bennett on LinkedIn.

DGX Spark
1 PF
FP4 AI Performance
128 GB
Unified Memory
200B
Parameter Models
4 TB
Encrypted Storage

Don't get left behind.
Start building with AI today.

Your DreamFactory instance is already AI-ready. Connect any AI agent to your database in about 5 minutes, then let anyone in your organization ask questions in plain English. The Academy shows you how, from first query to production AI pipelines.

Free for all DreamFactory customers. New modules added as our team learns.