Serverless vector search for AI

with full-fledged SQL support for unstructured data.

Built for AI

High-performance vector searches accelerated by AVX-512.

Zero-cost direct ingest from S3 without any indexing.

Seamlessly combine your vectors and metadata (see the sketch below).
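Conceptually, a single query can rank rows by similarity to a query vector while also filtering on ordinary fields. Here is a minimal sketch of what that might look like; the table, the column names, and the DOT_PRODUCT similarity function are illustrative assumptions, not Sneller's documented SQL surface.

# A hedged sketch: one SQL statement that filters on metadata and ranks by
# vector similarity. Table, columns, and DOT_PRODUCT are placeholders, not
# Sneller's documented functions.
query_vec = [0.12, -0.48, 0.07, 0.91]  # e.g. produced by an embedding model

# Render the query vector as a SQL array literal.
vec_literal = "[" + ", ".join(f"{x:.6f}" for x in query_vec) + "]"

sql = f"""
SELECT id, title
FROM documents
WHERE lang = 'en' AND stars >= 100                   -- ordinary metadata predicates
ORDER BY DOT_PRODUCT(embedding, {vec_literal}) DESC  -- vector similarity
LIMIT 10
"""
print(sql)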

Fast and Serverless

Leverage Sneller’s handwritten SIMD/AVX-512 assembly.

Query TBs per second using standard SQL.

No need to manage infrastructure.

Simple and Scalable

Scale to petabyte-size tables on S3 object storage.

Low-latency (~3 sec) direct ingest from S3.

Keep your data in your own S3 buckets.

Easy and Open Source

Ingest arbitrary JSON data without ETL or schema definitions.

Robust support for SQL with many useful extensions.

Simple REST API for all your SQL queries (sketched after this list).

Built by developers, for developers.
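As a rough illustration of the REST workflow, the sketch below posts a SQL statement over HTTP and prints the response. The endpoint URL, the /query path, the database parameter, and the table name are placeholders chosen for this example, not Sneller's documented API.

# A hedged sketch of posting SQL over HTTP, assuming bearer-token auth.
# Endpoint, path, parameters, and table name are placeholders.
import os
import requests

endpoint = os.environ.get("SNELLER_ENDPOINT", "https://sneller.example.com")
token = os.environ["SNELLER_TOKEN"]  # assumed bearer token

sql = "SELECT type, COUNT(*) FROM gharchive GROUP BY type"

resp = requests.post(
    f"{endpoint}/query",                  # placeholder query path
    params={"database": "demo"},          # placeholder database name
    headers={"Authorization": f"Bearer {token}"},
    data=sql,                             # the SQL text is the request body
    timeout=30,
)
resp.raise_for_status()
print(resp.text)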


Our blog

The Sneller development team regularly posts in-depth information about the product and its internals.

Serverless Vector Search

by Phil Hofer on August 16, 2023

Sneller’s serverless vector search eliminates the need for capacity planning and expensive migrations.


1 Billion Records on Grafana

by Frank Wessels on August 1, 2023

See how Sneller powers a Grafana dashboard with 1 billion records from the GitHub archive data.


Semantic Search with SQL

by Phil Hofer on June 21, 2023

Learn how Sneller makes it easy to perform semantic search using SQL for AI-powered applications.


How Sneller Cloud Runs on Itself

by Phil Hofer on June 19, 2023

Learn how we “eat our own dogfood” by using Sneller SQL to monitor Sneller Cloud.
