
Performing vector search

After deploying your index, Orama distributes it to over 300 points of presence across more than 100 countries. This guarantees the lowest possible latency for any search query, at any scale.

At the time of this writing, you can execute search queries using our official JavaScript SDK.

The SDK manages connections, caching, telemetry, and type safety for all your search operations. It is the official way to communicate with Orama Cloud.

Make sure you have the Orama SDK installed to start performing vector search at the edge!

Orama Cloud enables vector search by default. This means that you can search through your indexes using vectors generated by Orama’s embedding models.

Alternatively, you can provide your own vectors when inserting new documents into an Orama index, or use OpenAI's embedding models to generate vectors for your documents.
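For reference, here is a minimal sketch of what a document carrying its own vector can look like, using the open-source @orama/orama package purely for illustration. The embedding property name and the 1536-dimension size are assumptions here and must match whatever your own index schema defines:

import { create, insert } from "@orama/orama";

// Placeholder vector; in practice this would be the output of your embedding model.
const myVector = new Array(1536).fill(0);

// The vector property is declared in the schema together with its dimensionality.
const db = await create({
  schema: {
    title: "string",
    price: "number",
    embedding: "vector[1536]",
  },
});

// The pre-computed vector is supplied alongside the rest of the document.
await insert(db, {
  title: "Super Mario Bros.",
  price: 14.99,
  embedding: myVector,
});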

Once you have at least one index containing vectors, you can perform vector search by using the search function:

import { OramaClient } from "@oramacloud/client";

// Endpoint and public API key for your index, both available in the Orama Cloud dashboard.
const client = new OramaClient({
  endpoint: "",
  api_key: "",
});

// Run the query in vector mode and filter results on the "price" property.
const results = await client.search({
  term: "Super Mario videogame",
  mode: "vector",
  where: {
    price: {
      lt: 19.99,
    },
  },
});

Orama will automatically convert your search term (in this example, "Super Mario videogame") into an embedding using your OpenAI API key. It will then search through your vectors and return the full documents in their original format.
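Continuing from the snippet above, here is a minimal sketch of reading the response, assuming it mirrors the open-source Orama result shape, where count reports the number of matches and each entry in hits carries its score and the original document:

// Total number of matching documents.
console.log(`Found ${results.count} documents`);

// Each hit exposes its relevance score and the document in its original format.
for (const hit of results.hits) {
  console.log(hit.score, hit.document);
}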