Fauna v10 JavaScript client driver (current)
Version: 2.4.1 | Repository: fauna/fauna-js
Fauna’s JavaScript client driver lets you run FQL queries from JavaScript or TypeScript applications.
This guide shows how to set up the driver and use it to run FQL queries.
This driver can only be used with FQL v10. It’s not compatible with earlier versions of FQL. To use earlier FQL versions, use the `faunadb` package.
Supported runtimes
The driver supports the following runtime environments.
Cloud providers:
- Cloudflare Workers
- AWS Lambda (See AWS Lambda connections)
- Netlify
- Vercel
Installation
The driver is available on npm. Install it using your preferred package manager:
npm install fauna
Browsers can import the driver using a CDN link:
<script type="module">
import * as fauna from "https://cdn.jsdelivr.net/npm/fauna@latest/dist/browser/index.js";
</script>
API reference
API reference documentation for the driver is available at https://fauna.github.io/fauna-js/.
Sample app
For a practical example, check out the JavaScript sample app.
This sample app is an e-commerce application that uses Node.js and the Fauna JavaScript driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries.
Basic usage
The following application:
- Initializes a client instance to connect to Fauna
- Composes a basic FQL query using an `fql` string template
- Runs the query using `query()`
import { Client, fql, FaunaError } from "fauna";
// Use `require` for CommonJS:
// const { Client, fql, FaunaError } = require('fauna');

// Initialize the client to connect to Fauna
const client = new Client({
  secret: 'FAUNA_SECRET'
});

try {
  // Compose a query
  const query = fql`
    Product.sortedByPriceLowToHigh() {
      name,
      description,
      price
    }`;

  // Run the query
  const response = await client.query(query);
  console.log(response.data);
} catch (error) {
  if (error instanceof FaunaError) {
    console.log(error);
  }
} finally {
  // Clean up any remaining resources
  client.close();
}
Connect to Fauna
Each Fauna query is an independently authenticated request to the Core HTTP API’s Query endpoint. You authenticate with Fauna using an authentication secret.
Get an authentication secret
Fauna supports several secret types. For testing, you can create a key, which is a type of secret:
1. Log in to the Fauna Dashboard.
2. On the Explorer page, create a database.
3. In the database’s Keys tab, click Create Key.
4. Choose a Role of server.
5. Click Save.
6. Copy the Key Secret. The secret is scoped to the database.
Initialize a client
To send query requests to Fauna, initialize a `Client` instance using a Fauna authentication secret:
const client = new Client({
  secret: 'FAUNA_SECRET'
});
If not specified, `secret` defaults to the `FAUNA_SECRET` environment variable.
For other configuration options, see Client configuration.
Connect to a child database
A scoped key lets you use a parent database’s admin key to send query requests to its child databases.
For example, if you have an admin key for a parent database and want to connect to a child database named `childDB`, you can create a scoped key using the following format:
// Scoped key that impersonates an `admin` key for
// the `childDB` child database.
fn...:childDB:admin
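For illustration, a scoped key can be assembled with plain string formatting. The `scopedKey` helper below is hypothetical, not part of the driver:

```javascript
// Hypothetical helper: build a scoped key from a parent database's key
// secret, a child database name, and a role.
function scopedKey(parentSecret, childDb, role) {
  return `${parentSecret}:${childDb}:${role}`;
}

// For example, with a parent `admin` key secret:
// scopedKey("fn...", "childDB", "admin") => "fn...:childDB:admin"
```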
You can then initialize a `Client` instance using the scoped key:
const client = new Client({
  secret: 'fn...:childDB:admin'
});
Multiple connections
You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies.
You can create multiple client instances to connect to Fauna using different credentials or client configurations.
AWS Lambda connections
AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See Lambda execution environment.
When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler.
Fauna drivers keep socket connections that can time out during long freezes, causing `ECONNRESET` errors when thawed.
To prevent timeouts, create Fauna client connections inside function handlers. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance.
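As a sketch, a Node.js Lambda handler following this guidance might look like the following. The handler shape and query are illustrative, not taken from the Fauna docs:

```javascript
import { Client, fql } from "fauna";

export const handler = async (event) => {
  // Create the client inside the handler so a thawed execution
  // environment never reuses a stale, timed-out connection.
  const client = new Client();
  try {
    const response = await client.query(fql`Product.all().count()`);
    return { statusCode: 200, body: JSON.stringify(response.data) };
  } finally {
    // Release the connection before the environment is frozen.
    client.close();
  }
};
```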
Run FQL queries
Use `fql` string templates to compose FQL queries. Run the queries using `query()`:
const query = fql`Product.sortedByPriceLowToHigh()`;
client.query(query)
By default, `query()` uses query options from the Client configuration. You can pass options to `query()` to override these defaults. See Query options.
You can only compose FQL queries using string templates.
Variable interpolation
Use `${}` to pass native JavaScript variables to `fql` queries:
// Create a native JS var
const collectionName = "Product";
// Pass the var to an FQL query
const query = fql`
  let collection = Collection(${collectionName})
  collection.sortedByPriceLowToHigh()`;

client.query(query);
The driver encodes interpolated variables to an appropriate FQL type and uses the wire protocol to pass the query to the Core HTTP API’s Query endpoint. This helps prevent injection attacks.
Query composition
You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query:
// Create a reusable query fragment.
const product = fql`Product.byName("pizza").first()`;

// Use the fragment in another FQL query.
const query = fql`
  let product = ${product}
  product {
    name,
    price
  }`;

client.query(query);
Pagination
Use `paginate()` to iterate through a Set that contains more than one page of results. `paginate()` accepts the same Query options as `query()`.
// Adjust `pageSize()` as needed.
const query = fql`
  Product.sortedByPriceLowToHigh()
    .pageSize(2)`;
const pages = client.paginate(query);
for await (const products of pages) {
  for (const product of products) {
    console.log(product)
    // ...
  }
}
Use `flatten()` to get paginated results as a single, flat array:
const pages = client.paginate(query);
for await (const product of pages.flatten()) {
  console.log(product)
}
Query stats
Successful query responses and `ServiceError` errors include query stats:
import { fql, ServiceError } from "fauna";

try {
  const response = await client.query(fql`"Hello world"`);
  console.log(response.stats);
} catch (error) {
  if (error instanceof ServiceError) {
    const info = error.queryInfo;
    const stats = info.stats;
  }
}
Output:
{
  compute_ops: 1,
  read_ops: 0,
  write_ops: 0,
  query_time_ms: 0,
  contention_retries: 0,
  storage_bytes_read: 0,
  storage_bytes_write: 0,
  rate_limits_hit: [],
  attempts: 1
}
TypeScript support
The driver supports TypeScript. For example, you can apply a type parameter to your FQL query results:
import { fql, Client, type QuerySuccess } from "fauna";
const client = new Client({
  secret: 'FAUNA_SECRET'
});

type Customer = {
  name: string;
  email: string;
};

const query = fql`{
  name: "Alice Appleseed",
  email: "alice.appleseed@example.com",
}`;

const response: QuerySuccess<Customer> = await client.query<Customer>(query);
const customer_doc: Customer = response.data;

console.assert(customer_doc.name === "Alice Appleseed");
console.assert(customer_doc.email === "alice.appleseed@example.com");
Alternatively, you can apply a type parameter directly to your `fql` statements, and `Client` methods will infer your return types.
For backwards compatibility, if a type parameter is provided to a `Client` method, the provided type overrides the inferred type from your query.
type User = {
  name: string;
  email: string;
};

const query = fql<User>`{
  name: "Alice",
  email: "alice@site.example",
}`;

// Response will be typed as `QuerySuccess<User>`.
const response = await client.query(query);

// `userDoc` will be automatically inferred as `User`.
const userDoc = response.data;

console.assert(userDoc.name === "Alice");
console.assert(userDoc.email === "alice@site.example");

client.close();
Client configuration
The `Client` instance comes with reasonable configuration defaults. We recommend using the defaults in most cases.
If needed, you can configure the client to override the defaults. This also lets you set default Query options.
import { Client, endpoints } from "fauna";

const config = {
  // Configure the client
  client_timeout_buffer_ms: 5000,
  endpoint: endpoints.default,
  fetch_keepalive: false,
  http2_max_streams: 100,
  http2_session_idle_ms: 5000,
  secret: "FAUNA_SECRET",

  // Set default query options
  format: "tagged",
  linearized: false,
  long_type: "number",
  max_attempts: 3,
  max_backoff: 20,
  max_contention_retries: 5,
  query_tags: { tag: "value" },
  query_timeout_ms: 60_000,
  traceparent: "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
  typecheck: true,
};

const client = new Client(config);
For supported properties, see ClientConfiguration in the API reference.
Environment variables
The `secret` and `endpoint` properties default to the `FAUNA_SECRET` and `FAUNA_ENDPOINT` environment variables, respectively.
For example, if you set the following environment variables:
export FAUNA_SECRET=FAUNA_SECRET
export FAUNA_ENDPOINT=https://db.fauna.com/
You can initialize the client with a default configuration:
const client = new Client();
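The fallback behavior can be pictured as a simple nullish check. The `resolveSecret` helper below is hypothetical; the driver resolves these defaults internally:

```javascript
// Hypothetical sketch of how a missing `secret` option falls back to the
// FAUNA_SECRET environment variable.
function resolveSecret(explicitSecret) {
  return explicitSecret ?? process.env.FAUNA_SECRET;
}
```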
Retries
By default, the client automatically retries query requests that return a `limit_exceeded` error code. Retries use an exponential backoff.
Use the Client configuration’s `max_backoff` property to set the maximum time between retries. Similarly, use `max_attempts` to set the maximum number of retry attempts.
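To picture how these settings interact, here is a capped doubling schedule. This is an illustration only; the driver’s exact backoff timing and any jitter may differ:

```javascript
// Illustrative capped exponential backoff: the delay doubles with each
// attempt and never exceeds maxBackoff (in seconds).
function backoffSeconds(attempt, maxBackoff) {
  return Math.min(2 ** attempt, maxBackoff);
}

// With max_backoff: 20, attempts 0-5 wait 1, 2, 4, 8, 16, then 20 seconds.
```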
Query options
The Client configuration sets default query options for the following methods:
- `query()`
- `paginate()`
You can pass a `QueryOptions` object to override these defaults:
const options = {
  arguments: { name: "Alice" },
  format: "tagged",
  linearized: false,
  long_type: "number",
  max_contention_retries: 5,
  query_tags: { tag: "value" },
  query_timeout_ms: 60_000,
  traceparent: "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
  typecheck: true,
};
client.query(fql`"Hello, #{name}!"`, options);
For supported properties, see QueryOptions in the API reference.
Event Feeds
The driver supports Event Feeds. An Event Feed asynchronously polls an event source for paginated events.
To use Event Feeds, you must have a Pro or Enterprise plan.
Request an Event Feed
To get an event source, append `set.eventSource()` or `set.eventsOn()` to a supported Set.
To get paginated events, pass the event source to `feed()`:
const response = await client.query(fql`
  let set = Product.all()
  {
    initialPage: set.pageSize(10),
    eventSource: set.eventSource()
  }
`);

const { initialPage, eventSource } = response.data;
const feed = client.feed(eventSource);
If changes occur between the creation of the event source and the Event Feed request, the feed replays and emits any related events.
You can also pass a query that produces an event source directly to `feed()`:
const query = fql`Product.all().eventsOn(.price, .stock)`;
const feed = client.feed(query);
In most cases, you’ll get events after a specific event cursor or start time.
Get events after a specific start time
When you first poll an event source using an Event Feed, you usually include a `start_ts` (start timestamp) in the `FeedClientConfiguration` object that’s passed to `feed()`. The request returns events that occurred after the specified timestamp (exclusive).
`start_ts` is an integer representing a time in microseconds since the Unix epoch:
// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
// Convert to microseconds
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;
const options: FeedClientConfiguration = {
  start_ts: startTs
};

const feed = client.feed(fql`Product.all().eventSource()`, options);
Get events after a specific event cursor
After the initial request, you usually get subsequent events using the cursor for the last page or event.
To get events after a cursor (exclusive), include the `cursor` in the `FeedClientConfiguration` object that’s passed to `feed()`:
const options: FeedClientConfiguration = {
  // Cursor for a previous page
  cursor: "gsGabc456"
};

const feed = client.feed(fql`Product.all().eventSource()`, options);
Iterate on an Event Feed
`feed()` returns a `FeedClient` instance that acts as an `AsyncIterator`. You can use `for await...of` to iterate through the pages of events:
const query = fql`Product.all().eventSource()`;
// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;
const options: FeedClientConfiguration = {
  start_ts: startTs
};

const feed = client.feed(query, options);
for await (const page of feed) {
  console.log("Page stats", page.stats);
  for (const event of page.events) {
    switch (event.type) {
      case "add":
        // Do something on add
        console.log("Add event: ", event);
        break;
      case "update":
        // Do something on update
        console.log("Update event: ", event);
        break;
      case "remove":
        // Do something on remove
        console.log("Remove event: ", event);
        break;
    }
  }
}
Alternatively, use `flatten()` to get events as a single, flat array:
const query = fql`Product.all().eventSource()`;
// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;
const options = {
  start_ts: startTs
};

const feed = client.feed(query, options);
for await (const event of feed.flatten()) {
  switch (event.type) {
    case "add":
      // Do something on add
      console.log("Add event: ", event);
      break;
    case "update":
      // Do something on update
      console.log("Update event: ", event);
      break;
    case "remove":
      // Do something on remove
      console.log("Remove event: ", event);
      break;
  }
}
The Event Feed iterator will stop when there are no more events to poll.
Each page includes a top-level `cursor`. You can include the cursor in a `FeedClientConfiguration` object passed to `feed()` to poll for events after the cursor:
import { Client, fql } from "fauna";
const client = new Client();
async function processFeed(client, query, startTs = null, sleepTime = 300) {
  let cursor = null;
  while (true) {
    // Only include `start_ts` if `cursor` is null. Otherwise, only include `cursor`.
    const options = cursor === null ? { start_ts: startTs } : { cursor: cursor };
    const feed = client.feed(query, options);
    for await (const page of feed) {
      for (const event of page.events) {
        switch (event.type) {
          case "add":
            console.log("Add event: ", event);
            break;
          case "update":
            console.log("Update event: ", event);
            break;
          case "remove":
            console.log("Remove event: ", event);
            break;
        }
      }
      // Store the cursor of the last page
      cursor = page.cursor;
    }
    // Clear startTs after the first request
    startTs = null;
    console.log(`Sleeping for ${sleepTime} seconds...`);
    await new Promise(resolve => setTimeout(resolve, sleepTime * 1000));
  }
}

const query = fql`Product.all().eventsOn(.price, .stock)`;

// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;

processFeed(client, query, startTs);
If needed, you can store the cursor as a collection document:
import { Client, fql } from "fauna";
const client = new Client();
async function processFeed(client, query, startTs = null, sleepTime = 300) {
  // Create the `Cursor` collection.
  await client.query(
    fql`
      if (Collection.byName("Cursor").exists() == false) {
        Collection.create({
          name: "Cursor",
          fields: {
            name: {
              signature: "String"
            },
            value: {
              signature: "String?"
            }
          },
          constraints: [
            {
              unique: [
                {
                  field: ".name",
                  mva: false
                }
              ]
            }
          ],
          indexes: {
            byName: {
              terms: [
                {
                  field: ".name",
                  mva: false
                }
              ]
            }
          },
        })
      } else {
        null
      }
    `
  );

  // Create a `ProductInventory` document in the `Cursor` collection.
  // The document holds the latest cursor.
  await client.query(
    fql`
      if (Collection("Cursor").byName("ProductInventory").first() == null) {
        Cursor.create({
          name: "ProductInventory",
          value: null
        })
      } else {
        null
      }
    `
  );
  while (true) {
    // Get existing cursor from the `Cursor` collection.
    const cursorResponse = await client.query(
      fql`Cursor.byName("ProductInventory").first()`
    );
    let cursor = cursorResponse.data?.value || null;

    // Only include `start_ts` if `cursor` is null. Otherwise, only include `cursor`.
    const options = cursor === null ? { start_ts: startTs } : { cursor: cursor };

    const feed = client.feed(query, options);
    for await (const page of feed) {
      for (const event of page.events) {
        switch (event.type) {
          case "add":
            console.log("Add event: ", event);
            break;
          case "update":
            console.log("Update event: ", event);
            break;
          case "remove":
            console.log("Remove event: ", event);
            break;
        }
      }
      // Store the cursor of the last page
      cursor = page.cursor;
      await client.query(
        fql`
          Cursor.byName("ProductInventory").first()!.update({
            value: ${cursor}
          })
        `
      );
      console.log(`Cursor updated: ${cursor}`);
    }
    // Clear startTs after the first request
    startTs = null;
    console.log(`Sleeping for ${sleepTime} seconds...`);
    await new Promise(resolve => setTimeout(resolve, sleepTime * 1000));
  }
}

const query = fql`Product.all().eventsOn(.price, .stock)`;

// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;

processFeed(client, query, startTs).catch(console.error);
Error handling
Exceptions can be raised at two different places:
- While fetching a page
- While iterating a page’s events
This distinction allows you to ignore errors originating from event processing. For example:
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;

const options = {
  start_ts: startTs
};

const feed = client.feed(fql`
  Product.all().map(.name.toUpperCase()).eventSource()
`, options);
try {
  for await (const page of feed) {
    // Pages stop at the first error encountered, so it's safe to
    // handle event failures and then pull more pages.
    try {
      for (const event of page.events) {
        console.log("Event: ", event);
      }
    } catch (error: unknown) {
      console.log("Feed event error: ", error);
    }
  }
} catch (error: unknown) {
  console.log("Non-retryable error: ", error);
}
Each page’s `cursor` contains the cursor for the page’s last successfully processed event. If you’re using a loop to poll for changes, using the cursor skips any events that caused errors.
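The resume rule used throughout these examples — include `start_ts` only until a cursor is available, then switch to the cursor — can be captured in a small helper (hypothetical name, mirroring the ternary in the polling examples above):

```javascript
// Build Event Feed options from polling state: prefer the stored cursor
// once one exists; otherwise fall back to the start timestamp.
function resumeOptions(cursor, startTs) {
  return cursor == null ? { start_ts: startTs } : { cursor };
}
```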
Event Feed options
The client configuration sets the default options for `feed()`. You can pass a `FeedClientConfiguration` object to override these defaults:
const options: FeedClientConfiguration = {
  long_type: "number",
  max_attempts: 5,
  max_backoff: 1000,
  query_timeout_ms: 5000,
  client_timeout_buffer_ms: 5000,
  secret: "FAUNA_SECRET",
  cursor: undefined,
  start_ts: undefined,
};

client.feed(fql`Product.all().eventSource()`, options);
For supported properties, see FeedClientConfiguration in the API reference.
Event Streaming
The driver supports Event Streaming.
Start a stream
To get an event source, append `set.eventSource()` or `set.eventsOn()` to a supported Set.
To stream the source’s events, pass the event source to `stream()`:
const response = await client.query(fql`
  let set = Product.all()
  {
    initialPage: set.pageSize(10),
    eventSource: set.eventSource()
  }
`);

const { initialPage, eventSource } = response.data;
client.stream(eventSource)
You can also pass a query that produces an event source directly to `stream()`:
const query = fql`Product.all().eventsOn(.price, .stock)`;
const stream = client.stream(query);
Iterate on a stream
You can iterate on the stream using an async loop:
try {
  for await (const event of stream) {
    switch (event.type) {
      case "update":
      case "add":
      case "remove":
        console.log("Stream event:", event);
        // ...
        break;
    }
  }
} catch (error) {
  // An error is handled here if Fauna returns a terminal "error" event,
  // returns a non-200 response when trying to connect, or
  // reaches the max number of retries on network errors.
  // ... handle fatal error
}
Or you can use a callback function:
stream.start(
  function onEvent(event) {
    switch (event.type) {
      case "update":
      case "add":
      case "remove":
        console.log("Stream event:", event);
        // ...
        break;
    }
  },
  function onFatalError(error) {
    // An error is handled here if Fauna returns a terminal "error" event,
    // returns a non-200 response when trying to connect, or
    // reaches the max number of retries on network errors.
    // ... handle fatal error
  }
);
Close a stream
Use `close()` to close a stream:
const stream = await client.stream(fql`Product.all().eventSource()`)

let count = 0;
for await (const event of stream) {
  console.log("Stream event:", event);
  // ...
  count++;

  // Close the stream after 2 events
  if (count === 2) {
    stream.close()
    break;
  }
}
Stream options
The Client configuration sets default options for the `stream()` method. You can pass a `StreamClientConfiguration` object to override these defaults:
const options = {
  long_type: "number",
  max_attempts: 5,
  max_backoff: 1000,
  secret: "FAUNA_SECRET",
  status_events: true,
};

client.stream(fql`Product.all().eventSource()`, options)
For supported properties, see StreamClientConfiguration in the API reference.
Sample app
For a practical example that uses the JavaScript driver with Event Streams, check out the Event Streaming sample app.
Debug logging
To enable or disable debug logging, set the `FAUNA_DEBUG` environment variable to a string-encoded `LOG_LEVELS` integer:
# Enable logging for warnings (3) and above:
export FAUNA_DEBUG="3"
Logs are output to `console` methods. If `FAUNA_DEBUG` is not set or is invalid, logging is disabled.
For advanced logging, you can pass a custom log handler using the client configuration’s `logger` property:
import { Client, LOG_LEVELS } from "fauna";
import { CustomLogHandler } from "./your-logging-module";
// Create a client with a custom logger.
const client = new Client({
  logger: new CustomLogHandler(LOG_LEVELS.DEBUG),
});