Leveraging hardware vectorization and pure serverless compute for fast, low-cost queries
Sneller has built an Elasticsearch-, OpenSearch-, and SQL-compatible query engine that leverages vector hardware acceleration. Every Intel server processor shipped since 2018, as well as current AMD processors, supports AVX-512. AVX-512 was built for numerical computation, yet despite years of research no other company has built a query engine around it. Sneller's prime innovation is the comprehensive use of AVX-512 to accelerate queries: because each AVX-512 instruction operates on 16 32-bit values at once, Sneller can apply up to 16x the processing power of scalar engines at the same price. Sneller was accepted into Intel's Startup Disruptor Program based on this innovation.
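As a rough illustration of where that 16x comes from (this is a generic sketch of AVX-512 predicate evaluation, not Sneller's actual implementation), the C function below counts the rows of an int32 column that satisfy a simple filter, testing 16 rows per instruction. Compile with something like -march=skylake-avx512.

/* Count rows in an int32 column matching "value > threshold" using AVX-512. */
#include <immintrin.h>
#include <stddef.h>
#include <stdint.h>

size_t count_greater_than(const int32_t *col, size_t n, int32_t threshold) {
    size_t count = 0;
    size_t i = 0;
    __m512i thresh = _mm512_set1_epi32(threshold);   /* broadcast threshold to 16 lanes */

    /* Each iteration evaluates the predicate on 16 rows at once. */
    for (; i + 16 <= n; i += 16) {
        __m512i v = _mm512_loadu_si512((const void *)(col + i));
        __mmask16 m = _mm512_cmpgt_epi32_mask(v, thresh);   /* 16 comparisons in one instruction */
        count += (size_t)_mm_popcnt_u32((unsigned)m);       /* count the matching lanes */
    }

    /* Scalar tail for the last n % 16 rows. */
    for (; i < n; i++)
        count += (col[i] > threshold);

    return count;
}

A scalar engine issues one comparison per row; here a single compare instruction covers a full 512-bit register of rows, which is where the order-of-magnitude headroom on the same hardware comes from.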
Building on its AVX-512 innovations, Sneller built a fully serverless engine that uses no local storage on compute nodes: all compute state is maintained in distributed DRAM. This enables very fast, dynamic scaling of compute nodes in response to incoming query traffic. Sneller reads all data directly from object storage, so it can scale both compute and storage quickly, easily, and with very high reliability. The combination of AVX-512, a highly efficient serverless architecture, and scanning efficiencies enables Sneller to achieve a 10x+ price/performance advantage over engines such as AWS OpenSearch and Elasticsearch, proven in both head-to-head benchmarks and real-life customer testing.
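To make the read-directly-from-object-storage pattern concrete (again, an illustrative sketch rather than Sneller's implementation, and the bucket URL is a hypothetical placeholder), the C program below uses libcurl to pull a single byte range of an object straight into memory, never touching local disk. Build with: cc fetch_range.c -lcurl

#include <curl/curl.h>
#include <stdio.h>
#include <stdlib.h>

/* Accumulate the response body in memory (DRAM), not on local disk. */
static size_t collect(void *data, size_t size, size_t nmemb, void *userp) {
    FILE *buf = (FILE *)userp;            /* in-memory stream created with open_memstream */
    return fwrite(data, 1, size * nmemb, buf);
}

int main(void) {
    char  *mem = NULL;
    size_t len = 0;
    FILE  *buf = open_memstream(&mem, &len);   /* POSIX in-memory stream */

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *h = curl_easy_init();

    /* Hypothetical object URL; a real request would be signed for the storage provider. */
    curl_easy_setopt(h, CURLOPT_URL, "https://example-bucket.s3.amazonaws.com/table/part-0001");
    curl_easy_setopt(h, CURLOPT_RANGE, "0-1048575");        /* fetch only the first 1 MiB block */
    curl_easy_setopt(h, CURLOPT_WRITEFUNCTION, collect);
    curl_easy_setopt(h, CURLOPT_WRITEDATA, buf);

    CURLcode rc = curl_easy_perform(h);
    fclose(buf);

    if (rc == CURLE_OK)
        printf("fetched %zu bytes into memory\n", len);

    free(mem);
    curl_easy_cleanup(h);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}

Because compute nodes hold nothing but in-memory state like this, any node can serve any byte range of any object, which is what allows the fleet to grow or shrink with query traffic without rebalancing data.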