The Parallelization primitive allows you to run multiple agent requests simultaneously rather than sequentially. By executing independent tasks in parallel, you can significantly reduce total execution time, improve resource utilization, and build more responsive applications. Parallelization is essential for:
Performance Optimization: Reduce total execution time by running tasks concurrently
Scalable Workflows: Handle high-throughput scenarios with parallel processing
When you execute multiple agent requests in parallel:
Dispatch: All requests are sent simultaneously
Parallel Execution: Each agent task runs independently in its own session
Isolation: Tasks don’t share state or interfere with each other
Completion: Tasks complete at their own pace based on complexity
Aggregation: Results are collected and can be combined as needed
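As a minimal sketch of this flow (the prompts here are illustrative placeholders), independent requests are dispatched together with `Promise.all` and aggregated once they all resolve:

```typescript
// Dispatch: all three requests are sent simultaneously,
// and each runs independently in its own isolated session
const [summary, translation, sentiment] = await Promise.all([
  agentbase.runAgent({ message: "Summarize the attached article" }),
  agentbase.runAgent({ message: "Translate the attached article to French" }),
  agentbase.runAgent({ message: "Classify the sentiment of the attached article" })
]);

// Aggregation: results are collected and can be combined as needed
console.log(summary.message, translation.message, sentiment.message);
```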
Independence Requirement: Parallelized tasks should be independent - they shouldn’t depend on each other’s results. For dependent tasks, use sequential execution or workflows.
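For contrast, a hedged sketch of sequential execution for dependent tasks (again with illustrative prompts):

```typescript
// Dependent tasks: the draft needs the outline's result,
// so the calls must run sequentially rather than in Promise.all
const outline = await agentbase.runAgent({
  message: "Outline a blog post about parallel agent workflows"
});

const draft = await agentbase.runAgent({
  message: `Write a full draft following this outline: ${outline.message}`
});
```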
Match the agent mode to each task's complexity when running requests in parallel:

```typescript
// Use different modes for different task complexities
const results = await Promise.all([
  // Simple calculation - flash mode
  agentbase.runAgent({
    message: "Calculate 15% of 2000",
    mode: "flash"
  }),

  // Standard analysis - base mode
  agentbase.runAgent({
    message: "Analyze this dataset and find trends",
    mode: "base"
  }),

  // Complex reasoning - max mode
  agentbase.runAgent({
    message: "Design a comprehensive system architecture",
    mode: "max"
  })
]);

// Each request uses resources appropriate to its task
```
Process large datasets by splitting them across parallel agents:
```typescript
async function batchDataProcessing(records: any[]) {
  const batchSize = 100;
  const batches: any[][] = [];

  // Split into batches
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }

  // Process all batches in parallel
  const results = await Promise.all(
    batches.map((batch, index) =>
      agentbase.runAgent({
        message: `Process this batch of ${batch.length} records: ${JSON.stringify(batch)}

        For each record:
        1. Validate data format
        2. Enrich with additional info
        3. Calculate derived fields
        4. Return processed results`,
        system: `You are a data processing agent handling batch ${index + 1}`
      })
    )
  );

  // Combine results
  const allProcessedRecords = results.flatMap(r =>
    JSON.parse(r.message)
  );

  return allProcessedRecords;
}

// Process 1000 records in 10 parallel batches
const processed = await batchDataProcessing(largeDataset);
```
Gather information from multiple sources simultaneously:
```typescript
async function comprehensiveResearch(topic: string) {
  // Research different aspects in parallel
  const [
    academicResearch,
    industryTrends,
    competitorAnalysis,
    marketData,
    technicalSpecs
  ] = await Promise.all([
    agentbase.runAgent({
      message: `Research academic papers and studies about ${topic}`,
      system: "You are an academic research specialist"
    }),
    agentbase.runAgent({
      message: `Research industry trends and developments in ${topic}`,
      system: "You are an industry analyst"
    }),
    agentbase.runAgent({
      message: `Analyze competitors and their approaches to ${topic}`,
      system: "You are a competitive intelligence analyst"
    }),
    agentbase.runAgent({
      message: `Gather market size, growth rates, and forecasts for ${topic}`,
      system: "You are a market research analyst"
    }),
    agentbase.runAgent({
      message: `Research technical specifications and requirements for ${topic}`,
      system: "You are a technical analyst"
    })
  ]);

  // Synthesize all research
  const synthesis = await agentbase.runAgent({
    message: `Synthesize a comprehensive research report from multiple sources:

    Academic Research: ${academicResearch.message}
    Industry Trends: ${industryTrends.message}
    Competitor Analysis: ${competitorAnalysis.message}
    Market Data: ${marketData.message}
    Technical Specifications: ${technicalSpecs.message}

    Create an executive summary highlighting key insights from all sources.`,
    system: "You are a research synthesis specialist"
  });

  return synthesis;
}
```
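Generate marketing content for multiple products across several channels simultaneously: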
```typescript
async function contentCampaign(products: string[]) {
  // Generate content for all products simultaneously
  const contentPieces = await Promise.all(
    products.map(product =>
      Promise.all([
        // Product description
        agentbase.runAgent({
          message: `Write a compelling product description for ${product}`,
          system: "You are a product copywriter"
        }),
        // SEO metadata
        agentbase.runAgent({
          message: `Generate SEO-optimized title, meta description, and keywords for ${product}`,
          system: "You are an SEO specialist"
        }),
        // Social media posts
        agentbase.runAgent({
          message: `Create 3 social media posts promoting ${product} for Twitter, LinkedIn, and Instagram`,
          system: "You are a social media content creator"
        }),
        // Email campaign
        agentbase.runAgent({
          message: `Write a promotional email for ${product}`,
          system: "You are an email marketing specialist"
        })
      ])
    )
  );

  // Organize results by product
  const campaign = products.map((product, i) => ({
    product,
    description: contentPieces[i][0].message,
    seo: contentPieces[i][1].message,
    social: contentPieces[i][2].message,
    email: contentPieces[i][3].message
  }));

  return campaign;
}

// Generate a complete marketing campaign for 10 products
const campaign = await contentCampaign(productList);
```
Concurrency Limits: While Agentbase can handle many parallel requests, consider implementing concurrency limits for very large batches to avoid overwhelming your application.
```typescript
// Implement concurrency control for large batches
async function parallelWithConcurrencyLimit<T>(
  items: T[],
  limit: number,
  handler: (item: T) => Promise<any>
): Promise<any[]> {
  const results: any[] = [];
  const executing: Promise<any>[] = [];

  for (const item of items) {
    const promise = handler(item).then(result => {
      results.push(result);
      executing.splice(executing.indexOf(promise), 1);
    });
    executing.push(promise);

    // Once the limit is reached, wait for one in-flight task to finish
    // before starting the next
    if (executing.length >= limit) {
      await Promise.race(executing);
    }
  }

  // Wait for the remaining tasks; note that results arrive in
  // completion order, not input order
  await Promise.all(executing);
  return results;
}

// Process 1000 items with at most 10 concurrent requests
const results = await parallelWithConcurrencyLimit(
  items,
  10,
  item => agentbase.runAgent({ message: `Process ${item}` })
);
```
Monitor memory usage with large parallel operations:
```typescript
// Stream results instead of collecting them all in memory
async function* parallelStream(items: any[]) {
  const batchSize = 10;

  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    const results = await Promise.all(
      batch.map(item =>
        agentbase.runAgent({ message: `Process ${item}` })
      )
    );
    for (const result of results) {
      yield result;
    }
  }
}

// Process results as they come
for await (const result of parallelStream(largeDataset)) {
  // Process each result immediately
  await saveToDatabase(result);
  // Don't keep all results in memory
}
```
Remember: Parallelization is most effective for independent tasks of similar complexity. Use it to dramatically reduce execution time for batch operations, multi-source data gathering, and scalable processing.