Connect agents to databases, data warehouses, and data sources
Data Connectors enable agents to read from and write to databases, data warehouses, and various data sources, making it easy to work with structured and unstructured data across your organization.
The Data Connectors primitive provides agents with direct access to your data infrastructure. Whether you're working with SQL databases, NoSQL stores, data warehouses, or cloud storage, agents can query, analyze, and manipulate data using natural language.

Data Connectors are essential for (a minimal example follows this list):
Database Access: Query and update SQL and NoSQL databases
Data Analysis: Analyze data across multiple sources
Data Synchronization: Keep data in sync across systems
Reporting: Generate reports from live data
ETL Operations: Extract, transform, and load data
Data Validation: Verify data integrity and quality
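For example, here is a minimal sketch of a single-connector query, assuming the `agentbase.runAgent` API and `postgres` connector used throughout the examples below:

// Minimal sketch: one connector, one natural-language question
// (assumes DATABASE_URL points at a reachable PostgreSQL instance)
const result = await agentbase.runAgent({
  message: "How many orders were placed yesterday?",
  dataConnectors: {
    postgres: {
      enabled: true,
      connectionString: process.env.DATABASE_URL
    }
  }
});
console.log(result);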
SQL Databases
Connect to PostgreSQL, MySQL, SQL Server, and more
NoSQL Stores
Access MongoDB, Redis, DynamoDB, and other NoSQL databases
Data Warehouses
Query Snowflake, BigQuery, Redshift, and analytics platforms
Cloud Storage
Read and write to S3, GCS, Azure Blob, and object storage
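All connector types share the same configuration shape. As an illustrative sketch mixing a NoSQL store and object storage (the `mongodb` and `s3` keys and their options are assumptions; only `postgres` and `snowflake` appear in the examples below):

// Hypothetical sketch: a NoSQL connector alongside object storage
const result = await agentbase.runAgent({
  message: "Cross-reference user profiles in MongoDB with activity logs stored in S3",
  dataConnectors: {
    // "mongodb" and "s3" connector keys are assumptions, not confirmed API
    mongodb: { enabled: true, connectionString: process.env.MONGODB_URL },
    s3: { enabled: true, bucket: "activity-logs", region: "us-east-1" }
  }
});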
// Query across multiple databases
const result = await agentbase.runAgent({
  message: "Compare customer data between production DB and analytics warehouse",
  dataConnectors: {
    postgres: {
      enabled: true,
      connectionString: process.env.PRODUCTION_DB_URL,
      alias: "production"
    },
    snowflake: {
      enabled: true,
      connection: snowflakeConfig,
      alias: "analytics"
    }
  },
  system: `You have access to two databases:
- production: Live PostgreSQL database
- analytics: Snowflake data warehouse

Compare customer counts and identify any discrepancies.`
});
// Agent queries both sources and compares
const analyticsAgent = await agentbase.runAgent({
  message: "Create a summary of key business metrics for this week",
  dataConnectors: {
    snowflake: {
      enabled: true,
      connection: snowflakeConfig
    }
  },
  system: `Generate analytics report including:
1. Total revenue this week vs last week
2. New customer signups
3. Top 5 products by sales
4. Average order value
5. Customer churn rate

Present results in a structured format with percentage changes.`
});
// Agent queries warehouse and generates comprehensive report
console.log('Analytics:', analyticsAgent.report);
const customerLookup = await agentbase.runAgent({
  message: "Find customer information for email: [email protected]",
  dataConnectors: {
    postgres: {
      enabled: true,
      connectionString: process.env.DATABASE_URL
    }
  },
  system: `Look up customer and return:
- Basic info (name, email, phone)
- Account status and tier
- Recent orders (last 5)
- Support tickets (open and recent closed)
- Lifetime value
- Last interaction date

Format as a customer profile card.`
});
// Returns comprehensive customer data
console.log('Customer profile:', customerLookup.profile);
const reportAgent = await agentbase.runAgent({
  message: "Generate monthly sales report for January 2024",
  dataConnectors: {
    postgres: {
      enabled: true,
      connectionString: process.env.DATABASE_URL
    }
  },
  system: `Generate comprehensive sales report:
1. Total sales by product category
2. Sales by region
3. Top 10 customers by revenue
4. Sales rep performance
5. Month-over-month growth
6. Forecast for next month based on trends

Include visualizations and insights.`
});
// Agent queries data and generates formatted report
Limit Result Sizes

system: `When querying data:
- Always use LIMIT clause for large tables
- Default to 100 rows unless more are specifically needed
- Use pagination for large result sets
- Warn if query would return more than 1000 rows`
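Because these guardrails live in the system prompt, they apply to every query the agent generates during the run, not just the first one.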
Use Indexes Effectively
message: `Analyze query performance and suggest indexes`,
system: `When running queries:
- Use EXPLAIN to analyze query plans
- Identify missing indexes
- Suggest index creation for slow queries
- Avoid table scans on large tables`
Batch Operations
// Process data in batches
const result = await agentbase.runAgent({
  message: "Update all customer records to add loyalty_points field",
  dataConnectors: {
    postgres: {
      enabled: true,
      connectionString: process.env.DATABASE_URL,
      permissions: { write: true }
    }
  },
  system: `Update customers in batches:
- Process 1000 records at a time
- Use transactions for consistency
- Commit after each batch
- Log progress`
});
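Committing after each batch keeps individual transactions small: a failure part-way through loses at most one batch of work rather than the entire update, and locks on a busy table are held only briefly.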
Use Transactions

system: `For write operations:
- Always use transactions
- Validate data before commit
- Rollback on any error
- Log transaction details`
Validate Before Write
system: `Before inserting/updating data:
- Validate required fields are present
- Check data types match schema
- Verify foreign key references exist
- Ensure unique constraints won't be violated
- Confirm data is within valid ranges`
Pro Tip: Use read-only replicas for analytics queries to avoid impacting production database performance. The agent can automatically route queries to the appropriate database.
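A minimal sketch of the replica pattern (READ_REPLICA_URL and the `permissions: { write: false }` flag are assumptions extrapolated from the batch example above, not confirmed API):

// Point analytics work at a read replica instead of the primary
const analytics = await agentbase.runAgent({
  message: "Compute weekly revenue by region",
  dataConnectors: {
    postgres: {
      enabled: true,
      // READ_REPLICA_URL is an assumed env var for the replica's connection string
      connectionString: process.env.READ_REPLICA_URL,
      permissions: { write: false } // assumed flag, mirroring { write: true } above
    }
  },
  system: `This connection is a read-only replica. Run analytics queries here and never attempt writes.`
});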