
Bodo POCs explained:

What makes Bodo outperform Snowflake, Spark, and others

Recorded Jan 31st, 2023, 11AM PST


Brian Garback, Director of Sales


When we say Bodo can speed up your big data processing by 20x while reducing compute costs by up to 90%, we mean it. Follow along as we walk through three customer case studies to show the true power of Bodo’s compute engine.

Accelerating retail pricing jobs while reducing Azure costs by $250k: One company’s operating expenses were growing in line with a tripling of its customer base. To rein in compute costs, Bodo designed and implemented an HPC architecture integrating Bodo, Snowflake, and Azure, decreasing costs by 75% and speeding up runtimes by 4x. (A sketch of this Bodo-plus-Snowflake pattern appears after the case studies below.)

Saving $1M on large data engineering jobs on AWS: How Bodo became one company’s “pluggable compute engine,” complementing their existing architecture, to drive down the $2.8M in compute costs for large data transforms.

Migrating jobs from Spark to Bodo to reduce costs by $37k/year for one query alone: Bodo helped another company effectively triple the capacity of a shared, on-prem, 100-node cluster by moving jobs from Spark to Bodo, enabling 700+ data scientists to run data engineering and data science jobs at larger scale with less contention. (A before-and-after migration sketch follows below.)

All this and more will be covered. For each POC, we’ll walk through the challenges, example queries, and performance results.
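To ground the first case study, here is a minimal sketch of the Bodo-plus-Snowflake pattern it describes. The table name, columns, and connection string are hypothetical placeholders rather than details from the customer’s workload; the point is that ordinary pandas code, decorated with @bodo.jit, is compiled into parallel binaries that read from Snowflake across all cores.

```python
import bodo
import pandas as pd

# Hypothetical connection string; fill in real account details.
SNOWFLAKE_CONN = "snowflake://user:password@account/db/schema?warehouse=wh"

@bodo.jit
def reprice_products(conn):
    # Bodo reads the Snowflake table in parallel across all cores,
    # then runs the pandas transform as compiled, distributed code.
    df = pd.read_sql("SELECT product_id, region, units, list_price FROM sales", conn)
    df["revenue"] = df["units"] * df["list_price"]
    return df.groupby(["product_id", "region"], as_index=False)["revenue"].sum()

result = reprice_products(SNOWFLAKE_CONN)
print(result.head())
```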
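For the third case study, a before-and-after sketch of what a Spark-to-Bodo migration can look like. The S3 paths and column names are assumptions for illustration; the PySpark transform becomes plain pandas code that Bodo JIT-compiles and scales across MPI ranks (e.g., mpiexec -n 8 python etl.py), with no Spark driver or executors contending for the shared cluster.

```python
# Original PySpark version (for comparison):
#   from pyspark.sql import functions as F
#   df = spark.read.parquet("s3://bucket/events/")
#   out = (df.filter(df.status == "ok")
#            .groupBy("user_id")
#            .agg(F.sum("amount").alias("total")))
#   out.write.parquet("s3://bucket/totals/")

import bodo
import pandas as pd

@bodo.jit
def etl():
    # The same transform as plain pandas; Bodo parallelizes the I/O and
    # the groupby across however many MPI ranks the job is launched with.
    df = pd.read_parquet("s3://bucket/events/")
    df = df[df["status"] == "ok"]
    totals = df.groupby("user_id", as_index=False)["amount"].sum()
    totals.to_parquet("s3://bucket/totals/")

etl()
```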

Speakers


Brian Garback,
Director of Sales

Brian Garback leads customer engagement, acquisition, and onboarding, as well as revenue operations, for Bodo. He draws on 10+ years of experience in C-level strategy consulting, and on his background at IBM building a nationally managed CRM practice, to build and execute customer-focused roadmaps.

Register to watch recording