Prerequisites
- Basic TypeScript syntax and types
- Understanding of Promises and async programming
- Promise chaining fundamentals
What you'll learn
- Master Promise.all for parallel async operations
- Optimize performance with concurrent data fetching
- Handle type-safe parallel processing with error resilience
- Build high-performance async workflows and APIs
Introduction
Welcome to the world of parallel async operations with Promise.all! In this guide, we'll explore how to harness concurrent programming in TypeScript to build fast, responsive applications.
You'll discover how to execute multiple async operations simultaneously, dramatically reducing wait times and creating responsive user experiences. Whether you're fetching data from multiple APIs, processing files in parallel, or orchestrating complex workflows, mastering Promise.all is essential for building high-performance TypeScript applications.
By the end of this tutorial, you'll be composing async workflows that run operations in parallel like a well-oiled machine. Let's dive in!
Understanding Promise.all
What is Promise.all?
Promise.all is like having a team of workers executing tasks simultaneously. Think of it as the difference between cooking a meal one dish at a time versus having multiple chefs working on different dishes in parallel - the parallel approach is much faster!
In TypeScript terms, Promise.all provides (a short sketch follows this list):
- Parallel execution - multiple operations run simultaneously
- Performance optimization - significantly faster than sequential operations
- Type safety - maintains strong typing for all results
- All-or-nothing - succeeds only when all operations succeed
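Here is a minimal, self-contained sketch of those properties; the three promises and their values are made up purely for illustration:
// Hypothetical values chosen to show tuple type inference with Promise.all
const demoPromiseAll = async (): Promise<void> => {
  const results = await Promise.all([
    Promise.resolve(42),       // Promise<number>
    Promise.resolve('hello'),  // Promise<string>
    Promise.resolve(true)      // Promise<boolean>
  ]);
  // TypeScript infers results as [number, string, boolean]
  const [answer, greeting, flag] = results;
  console.log(answer.toFixed(0), greeting.toUpperCase(), flag);
  // All-or-nothing: if any input rejects, the whole Promise.all rejects
  try {
    await Promise.all([
      Promise.resolve('ok'),
      Promise.reject(new Error('one failure rejects everything'))
    ]);
  } catch (error) {
    console.error('Promise.all rejected:', (error as Error).message);
  }
};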
Why Use Promise.all?
Here's why Promise.all is a game-changer:
- Performance Boost: Execute operations concurrently instead of sequentially
- Resource Efficiency: Better utilization of CPU and network resources
- User Experience: Faster loading times and responsive interfaces
- Batch Operations: Process multiple items simultaneously
- API Optimization: Reduce total request time with parallel calls
Real-world example: Loading a dashboard that needs user data, recent activity, and notifications. Instead of waiting 300ms + 200ms + 100ms = 600ms sequentially, Promise.all executes them in parallel for roughly 300ms total (the duration of the slowest call), as the sketch below illustrates.
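A small comparison sketch, assuming three stand-in fetch functions with fixed delays (not real APIs):
// Hypothetical helpers with fixed delays, defined inline for the comparison
const delay = <T>(ms: number, value: T): Promise<T> =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));
const fetchUser = () => delay(300, { name: 'Ada' });        // ~300ms
const fetchActivity = () => delay(200, { sessions: 12 });   // ~200ms
const fetchNotes = () => delay(100, ['Welcome!']);          // ~100ms
// Sequential: each await waits for the previous call (~600ms total)
const loadSequentially = async () => {
  const user = await fetchUser();
  const activity = await fetchActivity();
  const notes = await fetchNotes();
  return { user, activity, notes };
};
// Parallel: all three calls start immediately (~300ms total, the slowest call)
const loadInParallel = async () => {
  const [user, activity, notes] = await Promise.all([
    fetchUser(),
    fetchActivity(),
    fetchNotes()
  ]);
  return { user, activity, notes };
};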
Basic Promise.all Patterns
Simple Parallel Execution
Let's start with fundamental Promise.all usage:
// Basic Promise.all syntax and type inference
// TypeScript automatically infers the result tuple types
// Simple parallel data fetching
const fetchUserDashboardData = async (userId: string) => {
console.log('Starting parallel data fetch...');
// Execute all operations in parallel
const [userProfile, userActivity, userNotifications] = await Promise.all([
fetchUserProfile(userId), // Returns Promise<UserProfile>
fetchUserActivity(userId), // Returns Promise<UserActivity>
fetchUserNotifications(userId) // Returns Promise<UserNotification[]>
]);
console.log('All data fetched in parallel!');
// TypeScript knows the exact types of each result
return {
profile: userProfile, // Type: UserProfile
activity: userActivity, // Type: UserActivity
notifications: userNotifications // Type: UserNotification[]
};
};
// Type definitions for our data
interface UserProfile {
id: string;
name: string;
email: string;
avatar: string;
joinedAt: Date;
}
interface UserActivity {
lastLogin: Date;
sessionsThisWeek: number;
actionsPerformed: number;
favoriteFeatures: string[];
}
interface UserNotification {
id: string;
message: string;
type: 'info' | 'warning' | 'success' | 'error';
createdAt: Date;
isRead: boolean;
}
// Mock API functions with realistic delays
const fetchUserProfile = async (userId: string): Promise<UserProfile> => {
console.log('Fetching user profile...');
await new Promise(resolve => setTimeout(resolve, 300)); // 300ms delay
return {
id: userId,
name: 'John Doe',
email: '[email protected]',
avatar: 'https://avatar.example.com/john.jpg',
joinedAt: new Date('2023-01-15')
};
};
const fetchUserActivity = async (userId: string): Promise<UserActivity> => {
console.log('Fetching user activity...');
await new Promise(resolve => setTimeout(resolve, 200)); // 200ms delay
return {
lastLogin: new Date(),
sessionsThisWeek: 12,
actionsPerformed: 847,
favoriteFeatures: ['Dashboard', 'Reports', 'Analytics']
};
};
const fetchUserNotifications = async (userId: string): Promise<UserNotification[]> => {
console.log('Fetching notifications...');
await new Promise(resolve => setTimeout(resolve, 100)); // 100ms delay
return [
{
id: 'notif_1',
message: 'Welcome to your dashboard!',
type: 'success',
createdAt: new Date(Date.now() - 3600000), // 1 hour ago
isRead: false
},
{
id: 'notif_2',
message: 'Your weekly report is ready',
type: 'info',
createdAt: new Date(Date.now() - 7200000), // 2 hours ago
isRead: true
}
];
};
// Usage - notice how all operations run in parallel!
const loadDashboard = async () => {
const startTime = Date.now();
const dashboardData = await fetchUserDashboardData('user_123');
const endTime = Date.now();
console.log(`Total time: ${endTime - startTime}ms`); // ~300ms instead of 600ms!
return dashboardData;
};
Processing Arrays in Parallel
// Process multiple items concurrently
interface Product {
id: string;
name: string;
price: number;
stock: number;
category: string;
}
interface ProductWithDetails {
product: Product;
reviews: number;
averageRating: number;
isPopular: boolean;
}
// Enrich multiple products with additional data in parallel
const enrichProductsData = async (productIds: string[]): Promise<ProductWithDetails[]> => {
console.log(`Enriching ${productIds.length} products in parallel...`);
// Map each product ID to an enrichment promise
const enrichmentPromises = productIds.map(async (productId): Promise<ProductWithDetails> => {
// For each product, fetch base data and details in parallel
const [product, reviewData, popularityData] = await Promise.all([
fetchProduct(productId),
fetchProductReviews(productId),
checkProductPopularity(productId)
]);
return {
product,
reviews: reviewData.count,
averageRating: reviewData.averageRating,
isPopular: popularityData.isPopular
};
});
// Execute all enrichment operations in parallel
const enrichedProducts = await Promise.all(enrichmentPromises);
console.log('All products enriched successfully!');
return enrichedProducts;
};
// Helper functions for product data
const fetchProduct = async (productId: string): Promise<Product> => {
await new Promise(resolve => setTimeout(resolve, 150));
return {
id: productId,
name: `Product ${productId}`,
price: Math.floor(Math.random() * 200) + 20,
stock: Math.floor(Math.random() * 100),
category: 'Electronics'
};
};
const fetchProductReviews = async (productId: string): Promise<{count: number, averageRating: number}> => {
await new Promise(resolve => setTimeout(resolve, 100));
return {
count: Math.floor(Math.random() * 500) + 10,
averageRating: Math.round((Math.random() * 2 + 3) * 10) / 10 // 3.0 - 5.0
};
};
const checkProductPopularity = async (productId: string): Promise<{isPopular: boolean}> => {
await new Promise(resolve => setTimeout(resolve, 80));
return {
isPopular: Math.random() > 0.6 // 40% chance of being popular
};
};
Real-World API Orchestration
Multi-Service Data Aggregation
// Real-world example: E-commerce homepage data aggregation
interface HomePageData {
featuredProducts: Product[];
categories: Category[];
userRecommendations: Product[];
promotions: Promotion[];
userInfo: UserProfile | null;
siteStats: SiteStatistics;
}
interface Category {
id: string;
name: string;
productCount: number;
imageUrl: string;
}
interface Promotion {
id: string;
title: string;
description: string;
discountPercentage: number;
validUntil: Date;
}
interface SiteStatistics {
totalProducts: number;
activeUsers: number;
ordersToday: number;
averageRating: number;
}
// Load complete homepage data with intelligent parallel fetching
const loadHomePageData = async (userId?: string): Promise<HomePageData> => {
console.log('Loading homepage data...');
// Phase 1: Essential data that everyone needs (parallel)
const essentialDataPromises = [
fetchFeaturedProducts(8),
fetchCategories(),
fetchActivePromotions(),
fetchSiteStatistics()
] as const;
// Phase 2: User-specific data (only if user is logged in)
const userSpecificPromises = userId ? [
fetchUserProfile(userId),
fetchUserRecommendations(userId, 6)
] as const : [
Promise.resolve(null),
Promise.resolve([])
] as const;
// Execute both phases in parallel
const [
[featuredProducts, categories, promotions, siteStats],
[userInfo, userRecommendations]
] = await Promise.all([
Promise.all(essentialDataPromises),
Promise.all(userSpecificPromises)
]);
console.log('Homepage data loaded successfully!');
return {
featuredProducts,
categories,
userRecommendations,
promotions,
userInfo,
siteStats
};
};
// API service functions
const fetchFeaturedProducts = async (limit: number): Promise<Product[]> => {
console.log('Fetching featured products...');
await new Promise(resolve => setTimeout(resolve, 400));
return Array.from({ length: limit }, (_, i) => ({
id: `featured_${i + 1}`,
name: `Featured Product ${i + 1}`,
price: Math.floor(Math.random() * 300) + 50,
stock: Math.floor(Math.random() * 50) + 10,
category: ['Electronics', 'Clothing', 'Books', 'Home'][Math.floor(Math.random() * 4)]
}));
};
const fetchCategories = async (): Promise<Category[]> => {
console.log('Fetching categories...');
await new Promise(resolve => setTimeout(resolve, 200));
return [
{ id: 'electronics', name: 'Electronics', productCount: 1543, imageUrl: '/images/electronics.jpg' },
{ id: 'clothing', name: 'Clothing', productCount: 2891, imageUrl: '/images/clothing.jpg' },
{ id: 'books', name: 'Books', productCount: 987, imageUrl: '/images/books.jpg' },
{ id: 'home', name: 'Home & Garden', productCount: 1234, imageUrl: '/images/home.jpg' }
];
};
const fetchUserRecommendations = async (userId: string, limit: number): Promise<Product[]> => {
console.log('Fetching personalized recommendations...');
await new Promise(resolve => setTimeout(resolve, 350));
return Array.from({ length: limit }, (_, i) => ({
id: `rec_${userId}_${i + 1}`,
name: `Recommended for You ${i + 1}`,
price: Math.floor(Math.random() * 200) + 30,
stock: Math.floor(Math.random() * 30) + 5,
category: 'Recommended'
}));
};
const fetchActivePromotions = async (): Promise<Promotion[]> => {
console.log('Fetching active promotions...');
await new Promise(resolve => setTimeout(resolve, 150));
return [
{
id: 'promo_1',
title: 'Summer Sale!',
description: 'Up to 50% off on electronics',
discountPercentage: 50,
validUntil: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000) // 1 week from now
},
{
id: 'promo_2',
title: 'Free Shipping',
description: 'Free shipping on orders over $99',
discountPercentage: 0,
validUntil: new Date(Date.now() + 30 * 24 * 60 * 60 * 1000) // 1 month from now
}
];
};
const fetchSiteStatistics = async (): Promise<SiteStatistics> => {
console.log('Fetching site statistics...');
await new Promise(resolve => setTimeout(resolve, 100));
return {
totalProducts: 45678,
activeUsers: 12543,
ordersToday: 1847,
averageRating: 4.6
};
};
Error Handling with Promise.all
Robust Error Management Strategies
// Problem: Promise.all fails fast - if any input promise rejects, the combined promise rejects
// Solution: Implement resilient patterns for production apps
interface ApiResult<T> {
success: boolean;
data?: T;
error?: string;
source: string;
}
// Safe wrapper for individual promises
const safePromise = <T>(
promise: Promise<T>,
source: string
): Promise<ApiResult<T>> => {
return promise
.then((data): ApiResult<T> => ({
success: true,
data,
source
}))
.catch((error): ApiResult<T> => ({
success: false,
error: error.message,
source
}));
};
// Resilient data loading with graceful degradation
const loadDashboardWithFallbacks = async (userId: string) => {
console.log('Loading dashboard with error resilience...');
// Wrap each API call for safe parallel execution
const results = await Promise.all([
safePromise(fetchUserProfile(userId), 'user-profile'),
safePromise(fetchUserActivity(userId), 'user-activity'),
safePromise(fetchUserNotifications(userId), 'notifications'),
safePromise(fetchUserSettings(userId), 'user-settings')
]);
// Process results and handle failures gracefully
const [profileResult, activityResult, notificationsResult, settingsResult] = results;
// Log any failures for monitoring
results.forEach(result => {
if (!result.success) {
console.warn(`${result.source} failed:`, result.error);
}
});
return {
profile: profileResult.success ? profileResult.data : getDefaultProfile(userId),
activity: activityResult.success ? activityResult.data : getDefaultActivity(),
notifications: notificationsResult.success ? notificationsResult.data : [],
settings: settingsResult.success ? settingsResult.data : getDefaultSettings(),
// Metadata for debugging and monitoring
loadStatus: {
profileLoaded: profileResult.success,
activityLoaded: activityResult.success,
notificationsLoaded: notificationsResult.success,
settingsLoaded: settingsResult.success,
successCount: results.filter(r => r.success).length,
totalCount: results.length
}
};
};
// Advanced retry logic with Promise.all
interface RetryConfig {
maxAttempts: number;
delayMs: number;
backoffMultiplier: number;
}
const retryablePromiseAll = async <T>(
promises: (() => Promise<T>)[],
config: RetryConfig = { maxAttempts: 3, delayMs: 1000, backoffMultiplier: 2 }
): Promise<T[]> => {
let attempt = 0;
while (attempt < config.maxAttempts) {
try {
console.log(`Attempt ${attempt + 1}/${config.maxAttempts}`);
// Execute all promises in parallel
const results = await Promise.all(promises.map(promiseFn => promiseFn()));
console.log('All operations succeeded!');
return results;
} catch (error) {
attempt++;
if (attempt >= config.maxAttempts) {
console.error('All retry attempts failed');
throw error;
}
// Exponential backoff delay
const delay = config.delayMs * Math.pow(config.backoffMultiplier, attempt - 1);
console.log(`Retrying in ${delay}ms...`);
await new Promise(resolve => setTimeout(resolve, delay));
}
}
throw new Error('Retry logic error'); // Should never reach here
};
// Default fallback data functions
const getDefaultProfile = (userId: string): UserProfile => ({
id: userId,
name: 'Guest User',
email: '',
avatar: '/default-avatar.png',
joinedAt: new Date()
});
const getDefaultActivity = (): UserActivity => ({
lastLogin: new Date(),
sessionsThisWeek: 0,
actionsPerformed: 0,
favoriteFeatures: []
});
const getDefaultSettings = () => ({
theme: 'light',
language: 'en',
notifications: true
});
const fetchUserSettings = async (userId: string) => {
await new Promise(resolve => setTimeout(resolve, 120));
return {
theme: 'dark',
language: 'en',
notifications: true,
privacy: 'private'
};
};
Performance Optimization Patterns
Batching and Chunking Strategies
// Handle large datasets efficiently with batching
interface BatchResult<T> {
results: T[];
processedCount: number;
totalTime: number;
batchTimes: number[];
}
// Process large arrays in optimized batches
const processBatchesInParallel = async <T, R>(
items: T[],
processor: (item: T) => Promise<R>,
batchSize: number = 10,
maxConcurrency: number = 3
): Promise<BatchResult<R>> => {
console.log(`Processing ${items.length} items in batches of ${batchSize}, max ${maxConcurrency} concurrent batches`);
const startTime = Date.now();
const batchTimes: number[] = [];
const results: R[] = [];
// Split items into batches
const batches: T[][] = [];
for (let i = 0; i < items.length; i += batchSize) {
batches.push(items.slice(i, i + batchSize));
}
// Process batches with controlled concurrency
for (let i = 0; i < batches.length; i += maxConcurrency) {
const batchGroup = batches.slice(i, i + maxConcurrency);
console.log(`Processing batch group ${Math.floor(i / maxConcurrency) + 1}/${Math.ceil(batches.length / maxConcurrency)}`);
const batchStartTime = Date.now();
// Process current batch group in parallel
const batchResults = await Promise.all(
batchGroup.map(batch =>
Promise.all(batch.map(item => processor(item)))
)
);
const batchEndTime = Date.now();
const batchTime = batchEndTime - batchStartTime;
batchTimes.push(batchTime);
// Flatten and collect results
batchResults.forEach(batchResult => {
results.push(...batchResult);
});
console.log(`Batch group completed in ${batchTime}ms`);
}
const totalTime = Date.now() - startTime;
return {
results,
processedCount: results.length,
totalTime,
batchTimes
};
};
// Example: Process user uploads in optimized batches
interface FileUpload {
id: string;
filename: string;
size: number;
type: string;
}
interface ProcessedFile {
id: string;
filename: string;
processedSize: number;
thumbnailUrl: string;
status: 'success' | 'error';
}
const processFileUploads = async (files: FileUpload[]): Promise<BatchResult<ProcessedFile>> => {
const fileProcessor = async (file: FileUpload): Promise<ProcessedFile> => {
// Simulate file processing time based on file size
const processingTime = Math.min(file.size / 1000, 2000); // Max 2 seconds
await new Promise(resolve => setTimeout(resolve, processingTime));
// Simulate occasional failures
if (Math.random() < 0.05) { // 5% failure rate
return {
id: file.id,
filename: file.filename,
processedSize: 0,
thumbnailUrl: '',
status: 'error'
};
}
return {
id: file.id,
filename: file.filename,
processedSize: file.size,
thumbnailUrl: `https://cdn.example.com/thumbnails/${file.id}.jpg`,
status: 'success'
};
};
return processBatchesInParallel(
files,
fileProcessor,
5, // Process 5 files per batch
2 // Maximum 2 concurrent batches
);
};
Promise.all Variations and Alternatives
// Different Promise utilities for different scenarios
// Promise.allSettled - never rejects, reports the outcome of every promise
const loadDataWithAllSettled = async (userId: string) => {
console.log('Using Promise.allSettled for complete data loading...');
const results = await Promise.allSettled([
fetchUserProfile(userId),
fetchUserActivity(userId),
fetchUserNotifications(userId)
]);
// Process all results, successful and failed
const processedResults = results.map((result, index) => {
const sources = ['profile', 'activity', 'notifications'];
if (result.status === 'fulfilled') {
return {
source: sources[index],
success: true,
data: result.value
};
} else {
return {
source: sources[index],
success: false,
error: result.reason.message
};
}
});
return processedResults;
};
// Promise.race - the first promise to settle wins
const loadWithFallback = async (userId: string) => {
console.log('Using Promise.race for fast fallback...');
// Try primary and backup data sources simultaneously
const primarySource = fetchUserProfile(userId);
const backupSource = new Promise<UserProfile>((resolve) => {
setTimeout(() => {
resolve(getDefaultProfile(userId));
}, 1000); // Fallback after 1 second
});
const result = await Promise.race([primarySource, backupSource]);
console.log('Fastest source responded!');
return result;
};
// Custom Promise utility: Promise.some (first N to succeed)
const promiseSome = async <T>(
promises: Promise<T>[],
count: number
): Promise<T[]> => {
return new Promise((resolve, reject) => {
const results: T[] = [];
const errors: Error[] = [];
let completed = 0;
if (count <= 0 || count > promises.length) {
reject(new Error('Invalid count parameter'));
return;
}
promises.forEach((promise, index) => {
promise
.then((result) => {
results.push(result);
completed++;
if (results.length >= count) {
resolve(results.slice(0, count));
}
})
.catch((error) => {
errors.push(error);
completed++;
// All promises have settled but not enough succeeded to reach the target count
if (completed === promises.length && results.length < count) {
reject(new Error(`Only ${results.length} of ${count} required promises succeeded`));
}
});
});
});
};
// Example: Load recommendations from multiple sources, need any 3
const loadRecommendations = async (userId: string) => {
console.log('Loading recommendations from multiple sources...');
const recommendationSources = [
fetchPersonalizedRecommendations(userId),
fetchTrendingRecommendations(),
fetchCategoryRecommendations(userId),
fetchCollaborativeRecommendations(userId),
fetchPopularRecommendations()
];
try {
// Get first 3 successful recommendations
const recommendations = await promiseSome(recommendationSources, 3);
console.log('Got recommendations from 3 sources!');
return recommendations.flat().slice(0, 12); // Take top 12 total
} catch (error) {
console.warn('Could not get enough recommendation sources');
return [];
}
};
// Mock recommendation functions
const fetchPersonalizedRecommendations = async (userId: string): Promise<Product[]> => {
await new Promise(resolve => setTimeout(resolve, 300));
return Array.from({ length: 4 }, (_, i) => ({
id: `personal_${i}`,
name: `Personal Rec ${i + 1}`,
price: 50 + i * 10,
stock: 10,
category: 'Personal'
}));
};
const fetchTrendingRecommendations = async (): Promise<Product[]> => {
await new Promise(resolve => setTimeout(resolve, 200));
return Array.from({ length: 4 }, (_, i) => ({
id: `trending_${i}`,
name: `Trending Item ${i + 1}`,
price: 40 + i * 15,
stock: 15,
category: 'Trending'
}));
};
const fetchCategoryRecommendations = async (userId: string): Promise<Product[]> => {
await new Promise(resolve => setTimeout(resolve, 250));
return Array.from({ length: 4 }, (_, i) => ({
id: `category_${i}`,
name: `Category Match ${i + 1}`,
price: 60 + i * 5,
stock: 8,
category: 'Category'
}));
};
const fetchCollaborativeRecommendations = async (userId: string): Promise<Product[]> => {
// Sometimes this source fails
if (Math.random() < 0.3) {
throw new Error('Collaborative filtering service unavailable');
}
await new Promise(resolve => setTimeout(resolve, 400));
return Array.from({ length: 4 }, (_, i) => ({
id: `collab_${i}`,
name: `Similar Users Like ${i + 1}`,
price: 70 + i * 8,
stock: 12,
category: 'Collaborative'
}));
};
const fetchPopularRecommendations = async (): Promise<Product[]> => {
await new Promise(resolve => setTimeout(resolve, 150));
return Array.from({ length: 4 }, (_, i) => ({
id: `popular_${i}`,
name: `Popular Choice ${i + 1}`,
price: 45 + i * 12,
stock: 20,
category: 'Popular'
}));
};
Practical Exercise: Social Media Dashboard
// Complete social media dashboard with parallel data loading
interface SocialPost {
id: string;
content: string;
author: string;
likes: number;
shares: number;
comments: number;
createdAt: Date;
platform: 'twitter' | 'linkedin' | 'instagram' | 'facebook';
}
interface AnalyticsData {
totalFollowers: number;
engagementRate: number;
topPost: SocialPost;
growthPercentage: number;
}
interface UserMentions {
mentionCount: number;
sentiment: 'positive' | 'neutral' | 'negative';
recentMentions: Array<{
text: string;
source: string;
timestamp: Date;
}>;
}
interface SocialDashboard {
recentPosts: SocialPost[];
analytics: AnalyticsData;
mentions: UserMentions;
scheduledPosts: SocialPost[];
competitors: Array<{
name: string;
followers: number;
engagement: number;
}>;
performance: {
postsThisWeek: number;
bestPerformingTime: string;
recommendedHashtags: string[];
};
}
// Load complete social media dashboard
const loadSocialMediaDashboard = async (userId: string): Promise<SocialDashboard> => {
console.log('Loading social media dashboard...');
// Phase 1: Essential data (parallel)
const essentialPromises = [
fetchRecentPosts(userId, 20),
fetchAnalyticsData(userId),
fetchUserMentions(userId)
] as const;
// Phase 2: Supplementary data (parallel)
const supplementaryPromises = [
fetchScheduledPosts(userId),
fetchCompetitorData(userId),
fetchPerformanceInsights(userId)
] as const;
// Execute both phases simultaneously
const [
[recentPosts, analytics, mentions],
[scheduledPosts, competitors, performance]
] = await Promise.all([
Promise.all(essentialPromises),
Promise.all(supplementaryPromises)
]);
console.log('Social media dashboard loaded successfully!');
return {
recentPosts,
analytics,
mentions,
scheduledPosts,
competitors,
performance
};
};
// Social media API functions
const fetchRecentPosts = async (userId: string, limit: number): Promise<SocialPost[]> => {
console.log('Fetching recent posts...');
await new Promise(resolve => setTimeout(resolve, 400));
const platforms: SocialPost['platform'][] = ['twitter', 'linkedin', 'instagram', 'facebook'];
return Array.from({ length: limit }, (_, i) => ({
id: `post_${i + 1}`,
content: `This is social media post #${i + 1}. Great content here!`,
author: `User ${userId}`,
likes: Math.floor(Math.random() * 500) + 10,
shares: Math.floor(Math.random() * 100) + 2,
comments: Math.floor(Math.random() * 50) + 1,
createdAt: new Date(Date.now() - i * 3600000), // Posts from recent hours
platform: platforms[Math.floor(Math.random() * platforms.length)]
}));
};
const fetchAnalyticsData = async (userId: string): Promise<AnalyticsData> => {
console.log('Fetching analytics data...');
await new Promise(resolve => setTimeout(resolve, 300));
return {
totalFollowers: 15247,
engagementRate: 4.8,
topPost: {
id: 'top_post_1',
content: 'This was our most engaging post this month!',
author: `User ${userId}`,
likes: 1247,
shares: 89,
comments: 156,
createdAt: new Date(Date.now() - 2 * 24 * 3600000), // 2 days ago
platform: 'linkedin'
},
growthPercentage: 12.5
};
};
const fetchUserMentions = async (userId: string): Promise<UserMentions> => {
console.log('Fetching user mentions...');
await new Promise(resolve => setTimeout(resolve, 250));
return {
mentionCount: 47,
sentiment: 'positive',
recentMentions: [
{
text: `Great insights from @${userId}! Really helpful content.`,
source: 'Twitter',
timestamp: new Date(Date.now() - 3600000) // 1 hour ago
},
{
text: `Thanks @${userId} for the amazing tutorial!`,
source: 'LinkedIn',
timestamp: new Date(Date.now() - 7200000) // 2 hours ago
},
{
text: `Love the work that @${userId} is doing in this space.`,
source: 'Instagram',
timestamp: new Date(Date.now() - 10800000) // 3 hours ago
}
]
};
};
const fetchScheduledPosts = async (userId: string): Promise<SocialPost[]> => {
console.log('Fetching scheduled posts...');
await new Promise(resolve => setTimeout(resolve, 200));
return Array.from({ length: 8 }, (_, i) => ({
id: `scheduled_${i + 1}`,
content: `Scheduled post #${i + 1} - Coming soon!`,
author: `User ${userId}`,
likes: 0,
shares: 0,
comments: 0,
createdAt: new Date(Date.now() + (i + 1) * 24 * 3600000), // Future posts
platform: ['twitter', 'linkedin', 'instagram', 'facebook'][i % 4] as SocialPost['platform']
}));
};
const fetchCompetitorData = async (userId: string) => {
console.log('Fetching competitor data...');
await new Promise(resolve => setTimeout(resolve, 350));
return [
{ name: 'Competitor A', followers: 23000, engagement: 3.2 },
{ name: 'Competitor B', followers: 18500, engagement: 5.1 },
{ name: 'Competitor C', followers: 31000, engagement: 2.8 },
{ name: 'Competitor D', followers: 12000, engagement: 6.4 }
];
};
const fetchPerformanceInsights = async (userId: string) => {
console.log('Fetching performance insights...');
await new Promise(resolve => setTimeout(resolve, 180));
return {
postsThisWeek: 12,
bestPerformingTime: '2:00 PM - 4:00 PM',
recommendedHashtags: ['#technology', '#innovation', '#leadership', '#growth', '#success']
};
};
// Usage example with performance monitoring
const demonstrateDashboardLoading = async () => {
const startTime = Date.now();
try {
const dashboard = await loadSocialMediaDashboard('user_123');
const endTime = Date.now();
const totalTime = endTime - startTime;
console.log(`Dashboard loaded in ${totalTime}ms`);
console.log(`Loaded ${dashboard.recentPosts.length} recent posts`);
console.log(`Analytics: ${dashboard.analytics.totalFollowers} followers, ${dashboard.analytics.engagementRate}% engagement`);
console.log(`${dashboard.mentions.mentionCount} mentions with ${dashboard.mentions.sentiment} sentiment`);
console.log(`${dashboard.scheduledPosts.length} scheduled posts`);
console.log(`Tracking ${dashboard.competitors.length} competitors`);
return dashboard;
} catch (error) {
console.error('Dashboard loading failed:', error);
throw error;
}
};
Conclusion
Congratulations! You've mastered the art of parallel async operations with Promise.all in TypeScript. You now have the skills to:
- Execute operations in parallel for maximum performance
- Optimize API calls and reduce total loading times
- Handle errors gracefully with resilient patterns
- Process large datasets efficiently with batching
- Build responsive applications that delight users
You've learned to harness the full power of concurrent programming while maintaining type safety and error resilience. Keep practicing these patterns, and you'll be the performance optimization hero your team needs!
Next Steps
Ready to level up further? Check out these advanced topics:
- Async/await syntax for cleaner parallel operations
- Promise.allSettled() for handling mixed results
- Promise.race() for timeout and fallback patterns
- Worker Threads for true parallel processing
- GraphQL DataLoader for intelligent request batching