Google Cloud Adds Scalable Vector Search to Memorystore for Valkey & Redis Cluster

Google Cloud has introduced scalable vector search capabilities to Memorystore for Valkey and Memorystore for Redis Cluster. The update allows developers to perform vector searches over billions of vectors at single-digit-millisecond latency.

This enhancement is particularly beneficial for applications that rely on generative AI, such as retrieval-augmented generation (RAG), recommendation systems, and semantic search.

The update leverages the ability to partition vector indices across the nodes of a cluster. Each node holds the partition of the index that corresponds to its portion of the keyspace, enabling the cluster to handle billions of vectors while maintaining single-digit-millisecond latency and over 99% recall. This architecture not only speeds up index build times linearly as nodes are added, but also improves search performance: logarithmically for hierarchical navigable small-world (HNSW) searches and linearly for brute-force searches.

Developers can use these new capabilities to scale out their clusters to 250 shards, storing billions of vectors in a single instance. This scalability is essential for enterprise applications that need to perform semantic searches over extensive datasets.

In addition to scalability, the update introduces support for hybrid queries, which combine vector searches with filters on numeric and tag fields. This functionality is particularly useful for fine-tuning search results based on specific criteria. For example, an online clothing retailer could use hybrid search to recommend similar items while filtering results by clothing type and price range.

To implement a hybrid query, developers can create a new vector index with additional fields for filtering:

FT.CREATE inventory_index SCHEMA embedding VECTOR HNSW 6 DIM 128 TYPE FLOAT32 DISTANCE_METRIC L2 clothing_type TAG clothing_price_usd NUMERIC

This creates an index `inventory_index` with a vector field `embedding` for the semantic embedding of the clothing item, a tag field `clothing_type` for the type of clothing (e.g., “dress” or “hat”), and a numeric field `clothing_price_usd` for the item’s price in US dollars.
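
As a brief illustrative sketch, not taken from Google’s announcement, the following Python snippet shows how an item matching this schema might be stored as a hash using the redis-py client; the endpoint, key name, and sample values are placeholders.

import numpy as np
import redis

# Connect to the cluster endpoint (placeholder host and port).
client = redis.RedisCluster(host="10.0.0.1", port=6379)

# Store one clothing item as a hash. The embedding is packed as raw
# FLOAT32 bytes so it matches the index definition (DIM 128, TYPE FLOAT32).
embedding = np.random.rand(128).astype(np.float32)
client.hset(
    "inventory:item:1",
    mapping={
        "embedding": embedding.tobytes(),
        "clothing_type": "dress",
        "clothing_price_usd": 150,
    },
)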

To perform a hybrid query on `inventory_index`:

FT.SEARCH inventory_index "(@clothing_type:{dress} @clothing_price_usd:[100-200])=>[KNN 10 @embedding $query_vector]" PARAMS 2 query_vector "..." DIALECT 2

This query retrieves the 10 items most similar to the query vector, restricted to the clothing type “dress” and a price range of 100 to 200 USD.
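
For application code, a minimal redis-py sketch of the same query might look as follows; the endpoint and the randomly generated query vector are placeholders, and the query string simply mirrors the command above.

import numpy as np
import redis

client = redis.RedisCluster(host="10.0.0.1", port=6379)

# Encode the query embedding as FLOAT32 bytes, matching the index type.
query_vector = np.random.rand(128).astype(np.float32).tobytes()

# Hybrid query: filter on the tag and numeric fields, then run KNN over
# the filtered candidates. FT.SEARCH takes no key, so it is explicitly
# sent to a single node here rather than routed by keyslot.
results = client.execute_command(
    "FT.SEARCH", "inventory_index",
    "(@clothing_type:{dress} @clothing_price_usd:[100-200])=>[KNN 10 @embedding $query_vector]",
    "PARAMS", "2", "query_vector", query_vector,
    "DIALECT", "2",
    target_nodes=redis.RedisCluster.RANDOM,
)
print(results)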

Some community members have cautioned against adopting Redis’s vector search if the technology is not already deployed within an organization. For example, Reddit user marr75 stated:

The better advice is probably to stick with your dominant data persistence and query technology, though. If that's RediSearch, stick with it. If it's not, don't pick it up for its vector search support which is fine but not best in class or state of the art.

Google Cloud is also contributing to the open-source community by donating its vector search capabilities to the Valkey key-value datastore. This initiative aims to enable Valkey developers to leverage vector search to create advanced generative AI applications.

In a recent Google announcement blog post, Sanjeev Mohan, principal analyst at SanjMo and former Gartner VP, shared his perspective on Google’s contributions:

Valkey is important for continuing to advance community-led efforts to provide feature-rich, open-source database alternatives. The launch of Valkey support in Memorystore is yet another example of Google’s dedication to providing truly open and accessible solutions for users. Their contributions to Valkey not only benefit developers seeking flexibility, but also strengthens the broader open-source ecosystem.

Rapid and precise vector searches are relevant in industries such as e-commerce, where understanding customer preferences and delivering tailored recommendations can be beneficial.
