Event Feeds and Event Streams
A Fauna event source emits an event whenever tracked changes are made to a database. Applications can consume the events in two ways:
- Event Feeds: Asynchronous requests that poll the event source for paginated events.
- Event Streams: A real-time subscription that pushes events from the event source to your application using an open connection to Fauna.
Use cases
Event Feeds and Event Streams are useful for building features that need to react to data changes, such as:
- Change data capture (CDC)
- Real-time dashboards
- Chat apps
- Pub/sub integration
- Multiplayer games
Create an event source
To create an event source, call set.eventSource() or set.eventsOn() on a supported Set in an FQL query:

- set.eventSource() tracks all documents in the Set:

  Product.all().eventSource()

- set.eventsOn() tracks changes to specified document fields in the Set:

  Product.sortedByPriceLowToHigh().eventsOn(.price)
Event source tokens
set.eventSource() and set.eventsOn() return a string-encoded token that represents the event source. The token has the EventSource type:

"g9WD1YPG..."

You can use the token to consume the event source as an Event Feed or an Event Stream.
Applications typically create feeds and streams using a Fauna client driver. The drivers provide methods for creating feeds and streams without directly handling event source tokens.
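For example, with the JavaScript driver you can pass an FQL query directly or handle the token yourself. A minimal sketch, assuming a Product collection:

import { Client, fql } from "fauna";

const client = new Client();

// Driver-managed: pass the event source query directly. The driver
// creates the event source and handles the token internally.
const feed = client.feed(fql`Product.all().eventSource()`);

// Token-based: run the query yourself, then pass the returned
// event source token to the driver.
const response = await client.query(fql`Product.all().eventSource()`);
const eventSource = response.data;
const stream = client.stream(eventSource);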
Event source token composition
The event source token is a hash that includes:
- The event source query. The query determines the events returned in Event Feeds and Event Streams that consume the source.
- The snapshot timestamp for the query that created the event source. This timestamp is the default start time for Event Feeds or Event Streams that consume the source.

Event source tokens are generated deterministically from these inputs, but they aren't idempotent: because each run captures a new snapshot timestamp, running the same event source query multiple times produces different event source tokens.
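For example, the following sketch runs the same query twice with the JavaScript driver. Each run captures its own snapshot timestamp, so the two tokens differ:

import { Client, fql } from "fauna";

const client = new Client();
const query = fql`Product.all().eventSource()`;

// Identical queries, different snapshot timestamps,
// and therefore different event source tokens.
const tokenA = (await client.query(query)).data;
const tokenB = (await client.query(query)).data;
console.log(tokenA, tokenB);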
Event Feeds
To use Event Feeds, you must have a Pro or Enterprise plan.
The following Fauna client drivers support Event Feeds:
Example
With the JavaScript driver, use feed() to define an event source and return the event source's paginated events.

To get the first page of events, you typically specify a start_ts (start timestamp) in the initial feed() request.

Each page of events includes a top-level cursor. In subsequent requests, you can pass this cursor instead of a start_ts to feed(). This polls for events after the cursor (exclusive):
import { Client, fql } from "fauna";

const client = new Client();

async function processFeed(client, query, startTs = null, sleepTime = 300) {
  let cursor = null;
  while (true) {
    // Only include `start_ts` if `cursor` is null. Otherwise, only include `cursor`.
    const options = cursor === null ? { start_ts: startTs } : { cursor: cursor };
    const feed = client.feed(query, options);
    for await (const page of feed) {
      for (const event of page.events) {
        switch (event.type) {
          case "add":
            console.log("Add event: ", event);
            break;
          case "update":
            console.log("Update event: ", event);
            break;
          case "remove":
            console.log("Remove event: ", event);
            break;
        }
      }
      // Store the cursor of the last page
      cursor = page.cursor;
    }
    // Clear startTs after the first request
    startTs = null;
    console.log(`Sleeping for ${sleepTime} seconds...`);
    await new Promise(resolve => setTimeout(resolve, sleepTime * 1000));
  }
}

const query = fql`Product.all().eventsOn(.price, .stock)`;

// Calculate timestamp for 10 minutes ago
const tenMinutesAgo = new Date(Date.now() - 10 * 60 * 1000);
const startTs = Math.floor(tenMinutesAgo.getTime() / 1000) * 1000000;

processFeed(client, query, startTs);
If needed, you can store the cursor as a collection document. For an example, see the Event Feeds sample app.
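For instance, here's a minimal sketch of persisting the cursor between polls, assuming a hypothetical FeedCursor collection with a byKey() index:

// Save the top-level cursor after processing a page.
await client.query(fql`
  let doc = FeedCursor.byKey("product-feed").first()
  if (doc != null) {
    doc!.update({ value: ${cursor} })
  } else {
    FeedCursor.create({ key: "product-feed", value: ${cursor} })
  }
`);

// On startup, read the stored cursor and resume the feed from it.
const stored = await client.query(fql`FeedCursor.byKey("product-feed").first()`);
const lastCursor = stored.data?.value ?? null;
const feed = client.feed(query, lastCursor ? { cursor: lastCursor } : {});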
Event Feeds sample app
The Event Feeds sample app shows how you can use Event Feeds to track changes to a database. The app uses an AWS Lambda function to send events for related changes to another service.
See the Event Feeds sample app.
How Event Feeds work
To request an Event Feed for an event source, the client driver sends a request containing an event source token to the Event Feed HTTP API endpoint.
When you first poll an event source using an Event Feed, you usually specify a start_ts (start timestamp). start_ts is an integer representing a time in microseconds since the Unix epoch. The request returns events that occurred after the specified timestamp (exclusive).

page_size limits the number of events returned per page:
curl -X POST \
  'https://db.fauna.com/feed/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "start_ts": 1710968002310000,
    "page_size": 10
  }'
The response includes an array of events for the event source:
{
  "events": [
    {
      "type": "update",
      "data": {
        "@doc": {
          "id": "111",
          "coll": { "@mod": "Product" },
          "ts": { "@time": "2099-09-04T21:14:29.970Z" },
          "name": "cups",
          "description": "Translucent 9 Oz, 100 ct",
          ...
        }
      },
      "txn_ts": 1725484469970000,
      "cursor": "gsGabc123",
      "stats": {
        "read_ops": 1,
        "storage_bytes_read": 320,
        "compute_ops": 1,
        "processing_time_ms": 1,
        "rate_limits_hit": []
      }
    },
    ...
  ],
  "cursor": "gsGabc456", // Top-level cursor
  "has_next": true,
  "stats": {
    "read_ops": 9,
    "storage_bytes_read": 886,
    "compute_ops": 1,
    "processing_time_ms": 8,
    "rate_limits_hit": []
  }
}
If the response's has_next property is true, the response includes a top-level cursor property. The client driver can use this cursor to get the next page of events:
curl -X POST \
  'https://db.fauna.com/feed/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "cursor": "gsGabc456",
    "page_size": 10
  }'
Response:
{
  "events": [
    {
      "type": "update",
      "data": {
        "@doc": {
          "id": "111",
          "coll": { "@mod": "Product" },
          "ts": { "@time": "2099-09-04T21:14:29.970Z" },
          "name": "clear cups",
          "description": "Translucent 9 Oz, 100 ct",
          ...
        }
      },
      "txn_ts": 1725484469970000,
      "cursor": "gsGabc456",
      "stats": {
        "read_ops": 1,
        "storage_bytes_read": 320,
        "compute_ops": 1,
        "processing_time_ms": 1,
        "rate_limits_hit": []
      }
    },
    ...
  ],
  "cursor": "gsGabc789",
  "has_next": true,
  "stats": {
    "read_ops": 9,
    "storage_bytes_read": 886,
    "compute_ops": 1,
    "processing_time_ms": 8,
    "rate_limits_hit": []
  }
}
You can reuse cursors across event sources with identical queries in the same database.
Get events after a specific start time
To get events after a specific time, the client driver uses the start_ts request body parameter:
curl -X POST \
  'https://db.fauna.com/feed/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "page_size": 10,
    "start_ts": 1710968002310000
  }'
The period between the request and the start_ts can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes. Requests that use a start_ts older than this period return an error event with the invalid_start_time error code.
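For example, a sketch of the microsecond math for a valid start_ts when history_days is unset:

// start_ts is microseconds since the Unix epoch. With history_days
// unset, it can't be more than 15 minutes in the past.
const fiveMinutesAgo = Date.now() - 5 * 60 * 1000; // milliseconds
const startTs = fiveMinutesAgo * 1000;             // ms → µs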
Get events after a specific cursor
To get events from a previous event's cursor, the client driver uses the cursor request body parameter. The event source will replay events that occurred after the cursor (exclusive):
curl -X POST \
  'https://db.fauna.com/feed/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "cursor": "gsGabc456",
    "page_size": 10
  }'
The period between the request and the cursor event's txn_ts (transaction timestamp) can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes. Requests that use a cursor older than this period return an error event with the invalid_start_time error code.
You can reuse cursors across event sources with identical queries in the same database.
Default start time
If an Event Feed request doesn't specify a start_ts (start timestamp) or cursor, the request's start_ts defaults to the event source query's timestamp.

If the timestamp is outside the history retention period of the source Set's collection, the request returns an error event with the invalid_start_time error code.
Event Streams
The following Fauna client drivers support real-time Event Streams:
Example
With the JavaScript driver, you use the stream() function to define and subscribe to an event source in real time:
import { Client, fql } from "fauna";

const client = new Client();
const query = fql`Product.where(.type == 'book' && .price < 100_00).eventSource()`;
const stream = client.stream(query);

try {
  for await (const event of stream) {
    switch (event.type) {
      case "add":
        // Do something on add
        console.log(event.data);
        break;
      case "update":
        // Do something on update
        console.log(event.data);
        break;
      case "remove":
        // Do something on remove
        console.log(event.data);
        break;
    }
  }
} catch (error) {
  console.log(error);
}
You can also pass an event source token to stream(). This lets you get query results alongside the stream:
import { Client, fql } from "fauna";

const client = new Client();
const query = fql`
  let products = Product.where(.type == 'book' && .price < 100_00)
  {
    products: products,
    eventSource: products.eventSource()
  }`;

const response = await client.query(query);
const { products, eventSource } = response.data;

for await (const product of client.paginate(products)) {
  console.log(product);
}

const stream = client.stream(eventSource);

try {
  for await (const event of stream) {
    switch (event.type) {
      case "add":
        // Do something on add
        console.log(event.data);
        break;
      case "update":
        // Do something on update
        console.log(event.data);
        break;
      case "remove":
        // Do something on remove
        console.log(event.data);
        break;
    }
  }
} catch (error) {
  console.log(error);
}
Event Streams sample app
The Event Streams sample app shows how you can use Event Streams to build a real-time chat app. You can use it as a starting point for your own app.
See the Event Streams sample app.
How Event Streams work
To subscribe to an event source’s events in real time, the client driver sends a request containing the event source token to the Event Stream HTTP API endpoint:
curl -X POST \
  'https://db.fauna.com/stream/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>"
  }'
In response, the event source emits a status event, indicating the stream has started:
{
  "type": "status",
  "txn_ts": 1710968002310000,
  "cursor": "gsGabc123",
  "stats": {
    "read_ops": 8,
    "storage_bytes_read": 208,
    "compute_ops": 1,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
The Event Stream API request's connection remains open. If a tracked change occurs, the event source emits a related add, remove, or update event. These events include the triggering document, encoded using the tagged format, in the data field:
{
  "type": "update",
  "data": {
    "@doc": {
      "id": "392914348360597540",
      "coll": { "@mod": "Product" },
      "ts": { "@time": "2099-03-21T12:35:18.680Z" },
      "name": "pizza",
      "description": "Frozen Cheese",
      ...
    }
  },
  "txn_ts": 1711024518680000,
  "cursor": "gsGdef456",
  "stats": {
    ...
  }
}
If a change occurs between the creation of the event source and the start of a stream, the stream replays and emits the related events.
Default start time
If an Event Stream request doesn't specify a start_ts (start timestamp) or cursor, start_ts defaults to the event source query's timestamp.

If the timestamp is outside the history retention period of the source Set's collection, the stream returns an error event with the invalid_start_time error code.
Stream disconnection
Fauna's client drivers can detect connection loss and automatically reconnect disconnected Event Streams. Events that occur during network issues are replayed and emitted when the stream reconnects.

When a stream reconnects, the event source emits a new status event:
{
  "type": "status",
  "txn_ts": 1710968002310000,
  "cursor": "gsGabc123",
  "stats": {
    "read_ops": 8,
    "storage_bytes_read": 208,
    "compute_ops": 1,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
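If you manage restarts yourself, you can track the most recent event cursor while consuming the stream. A minimal sketch, assuming driver events surface the cursor field shown in the HTTP examples:

let lastCursor = null;
const stream = client.stream(fql`Product.all().eventSource()`);

try {
  for await (const event of stream) {
    lastCursor = event.cursor; // remember the newest cursor
    // ... handle the event ...
  }
} catch (error) {
  // On an unrecoverable disconnect, restart from lastCursor.
  // See "Restart from an event cursor" below.
  console.error("Stream closed:", error, "last cursor:", lastCursor);
}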
Restart an Event Stream
The Event Stream HTTP API endpoint supports two methods for restarting disconnected streams: restarting from an event cursor and restarting from a transaction timestamp. The methods are mutually exclusive and can't be used together.
Restart from an event cursor
To restart a stream from a previous event's cursor, the client driver uses the cursor request body parameter. The restarted stream will replay events that occurred after the cursor (exclusive):
curl -X POST \
  'https://db.fauna.com/stream/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "cursor": "gsGabc123"
  }'
The period between the stream restart and the cursor event's txn_ts (transaction timestamp) can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes. Requests that use a cursor older than this period return an error event with the invalid_start_time error code.
Restart from a transaction timestamp
To restart a stream from a transaction timestamp, the client driver uses the start_ts request body parameter. start_ts is an integer representing the stream start time in microseconds since the Unix epoch:
curl -X POST \
  'https://db.fauna.com/stream/1' \
  -H 'Authorization: Bearer <FAUNA_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "start_ts": 1710968002310000
  }'
The period between the stream restart and the start_ts can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes. Requests that use a start_ts older than this period return an error event with the invalid_start_time error code.

For Event Streams, start_ts must be after the event source query's timestamp.
Permission changes
If the authentication secret used to create an event source is revoked or the secret’s privileges change, the stream consuming the event source closes due to permission loss. This applies even if the secret still has access to the documents the event source is tracking.
Supported Sets
You can only create an event source on a supported Set. The Set can only contain documents from a user-defined collection.
The source's Set affects the exact behavior of set.eventSource() or set.eventsOn().
| Supported Set | Behavior |
|---|---|
| User-defined collection | You can't create an event source on a system collection. |
| User-defined index | You can't create an event source on an index for a system collection. |
Collection event sources
Calling set.eventSource() directly on collection.all() tracks any change to any document in the collection.

The following query tracks any change to documents in the Product collection:

Product.all().eventSource()

For example, if you change a Product document's price to below 100_00, the event source emits an update event.
You can use collection.where() to filter the tracked documents for a collection.

For example, the following query only tracks Product documents with a price of less than 100_00:

Product.where(.price < 100_00).eventSource()

If you change a Product document's price from above 100_00 to below 100_00, the event source emits an add event. Before the change, the document would not have been part of the event source's Set.
You can use set.eventsOn() to only track changes to specific fields.

The following query tracks changes made to any Product document's description. The event source doesn't emit events for changes to other fields:

Product.all().eventsOn(.description)
Index event sources
Index event sources only emit events for changes to the index's terms or values fields.

For example, the following Product collection's byCategory() index has:

- A term field of category
- Value fields of name and price

collection Product {
  *: Any

  index byCategory {
    terms [.category]
    values [.name, .price]
  }

  ...
}
The following query only tracks changes to the category, name, or price fields for Product documents with a category of produce:
let produce = Category.byName("produce").first()
Product.byCategory(produce).eventSource()
When called on an index, set.eventsOn() only accepts the index's terms or values fields as arguments.
For example, in the following query, set.eventsOn() only accepts .category, .name, or .price as arguments:
let produce = Category.byName("produce").first()
Product.byCategory(produce).eventsOn(.category, .name)
Document event sources
You can use event sources to track changes to a Set containing a single document. These event sources only emit events when the document changes.
Use Set.single() to create a Set from a document:
let product = Product.byId(111)!
Set.single(product).eventSource()
Use set.eventsOn() to only track changes to specific fields of the document:
let product = Product.byId(111)!
Set.single(product).eventsOn(.name, .price)
Resource deletion
If the database or source for an event source is deleted, the event source won’t emit any further events. Event Streams for the event source don’t automatically close.
Supported transformations and filters
Event sources only support source Sets that are transformed or filtered using supported methods, such as set.where(), set.map(), and field projection. This ensures Fauna can convert the Set to an event source. Sets using unsupported transformations or filters will fail to convert.
For example, the Set for the following event source uses the unsupported set.drop() method:
Product.all().drop(10).eventSource()
Running the query returns the following error:
invalid_receiver: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`.

error: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`.
at *query*:1:35
  |
1 | Product.all().drop(10).eventSource()
  |                                   ^^
  |
Filters
Use set.where() to filter an event source's Set.

For example, the following query only tracks changes to Product documents with:

- A category of produce
- A price less than 100_00

let produce = Category.byName("produce").first()
Product
  .all()
  .where(.category == produce)
  .where(.price < 100_00)
  .eventSource()
You can also call set.where() directly on set.eventSource() or set.eventsOn(). The following query is equivalent to the previous one:

let produce = Category.byName("produce").first()
Product
  .all()
  .eventSource()
  .where(.category == produce)
  .where(.price < 100_00)
set.where() produces a new Set based on its criteria. The criteria affect the event types emitted for changes, as shown in the sketch after this list:

- Creating a document in the Set produces an add event.
- Updating a document so that it moves into the Set produces an add event.
- Updating a document so that it remains in the Set produces an update event.
- Updating a document so that it moves out of the Set produces a remove event.
- Deleting a document from the Set produces a remove event.
- Any other changes produce no events.
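For example, given Product.where(.price < 100_00).eventSource(), the following driver calls illustrate each case. A sketch; document ID 111 and these field values are assumptions:

// Update that moves the document into the Set → "add" event:
await client.query(fql`Product.byId(111)!.update({ price: 99_00 })`);

// Update that keeps the document in the Set → "update" event:
await client.query(fql`Product.byId(111)!.update({ stock: 5 })`);

// Update that moves the document out of the Set → "remove" event:
await client.query(fql`Product.byId(111)!.update({ price: 200_00 })`);

// Deleting a document in the Set → "remove" event:
await client.query(fql`Product.byId(111)!.delete()`);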
While filters affect events emitted for an event source, they don’t affect event processing, which impacts performance and cost. See How filters affect costs and performance.
Projection
An event source's add and update event types include a data field. This field contains the document that triggered the event.

Use set.map() or projection to return only specific document fields in these events.

For example, the following query tracks changes to any field in any Product document. The query uses set.map() to only include the name and price document fields in the data field of add and update events:

Product
  .all()
  .map(product => {
    name: product.name,
    price: product.price
  })
  .eventSource()
The following query uses projection and is equivalent to the previous one.
let products = Product.all() { name, price }
products.eventSource()
The previous queries can produce the following add event. The event's data field includes only the name and price document fields:
{
  "type": "add",
  "data": { "name": "pizza", "price": "1599" },
  "txn_ts": 1711028312060000,
  "cursor": "gsGghu789",
  "stats": {
    "read_ops": 1,
    "storage_bytes_read": 69,
    "compute_ops": 1,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
Events
Event sources emit one event per document per transaction.
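For example, the following sketch makes two writes to the same document in a single query, which runs as one transaction. The event source emits a single event for the document:

await client.query(fql`
  let product = Product.byId(111)!
  product.update({ stock: 10 })
  product.update({ stock: 20 })
`);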
Event order
Events are ordered by ascending txn_ts (transaction timestamp). Events from the same transaction share the same txn_ts, but their order may differ in Event Streams across clients. Event Feeds return events in the same order across clients.
Event types
The following table outlines supported event types.
| Event type | Sent when … |
|---|---|
| add | A document is added to the Set. |
| remove | A document is removed from the Set. Event sources don't emit remove events for documents removed by a database or source deletion. See Resource deletion. |
| update | A document in the Set changes. |
| status | An Event Stream starts or reconnects. Streams also periodically emit status events to keep the connection open. See the status event schema. Event Feeds don't receive or include status events. |
| error | An event source can no longer be consumed due to an error. See the error event schema and Error codes. |
Event schema
Events with a type other than status or error have the following schema:
{
  "type": "add",
  "data": {
    "@doc": {
      "id": "392914348360597540",
      "coll": { "@mod": "Product" },
      "ts": { "@time": "2099-03-20T21:46:12.580Z" },
      "foo": "bar"
    }
  },
  "txn_ts": 1710968002310000,
  "cursor": "gsGabc123",
  "stats": {
    "read_ops": 8,
    "storage_bytes_read": 208,
    "compute_ops": 1,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
status event types have the following schema:
{
  "type": "status",
  "txn_ts": 1710968002310000,
  "cursor": "gsGabc123",
  "stats": {
    "read_ops": 0,
    "storage_bytes_read": 0,
    "compute_ops": 0,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
error event types have the following schema:
{
  "type": "error",
  "error": {
    "code": "invalid_stream_start_time",
    "message": "Stream start time 2099-09-05T14:27:10.100Z is too far in the past. Recreate the stream and try again."
  },
  "stats": {
    "read_ops": 0,
    "storage_bytes_read": 0,
    "compute_ops": 0,
    "processing_time_ms": 0,
    "rate_limits_hit": []
  }
}
| Field name | Type | Description |
|---|---|---|
| type | string | Event type: add, remove, update, status, or error. Event Feeds don't receive or include status events. |
| data | object | Document that triggered the event. FQL values are encoded using the tagged format. The data field isn't included in status or error events. |
| error | object | Contains an error for the event source. Only included in error events. |
| txn_ts | integer | The related transaction's commit time in microseconds since the Unix epoch. The txn_ts field isn't included in error events. |
| cursor | string | Cursor for the event. The Fauna HTTP API and client drivers can use the cursor to replay events that occurred after the cursor. See Restart from an event cursor. The cursor field isn't included in error events. |
| stats | object | Event statistics: read_ops, storage_bytes_read, compute_ops, processing_time_ms, and rate_limits_hit. |
Error codes
The following table outlines possible error codes for error events.

| Error code | Cause |
|---|---|
| | An internal error caused by Fauna. |
| invalid_start_time | The requested cursor or start time is too far in the past. The collection containing the stream's document Set doesn't retain enough history to replay the requested events. |
| | The authentication secret used to create the event source was revoked or the secret's privileges changed. See Permission changes. |
| | The event source attempts to process more than 128 events at once, exceeding the event limit. |
| | The event source would replay more than 128 events at once, exceeding the event limit. |
Costs and performance
An event source's cost and performance are closely related to its shape. An event source's shape is defined by:

- The source Set
- Transformations and filters applied to the source Set
Processing and sending events consume Transactional Read Operations (TROs) and Transactional Compute Operations (TCOs).
The exact number of TROs and TCOs consumed varies based on the event source’s shape. See Event Streams in the billing docs.
Depending on its cardinality and throughput, consuming an event source for a large Set may cause delays in event delivery and consume more operations.
If an event source replays events, it may also consume additional operations.
Each event includes stats that report consumed operations. If you exceed your plan's operations limit, the event source emits an error event. For Event Streams, this closes the stream.
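For example, you can tally consumed operations from event stats while draining a feed. A minimal sketch, assuming driver events surface the stats field shown in the HTTP examples:

let readOps = 0;
let computeOps = 0;

const feed = client.feed(fql`Product.all().eventSource()`);
for await (const page of feed) {
  for (const event of page.events) {
    readOps += event.stats?.read_ops ?? 0;
    computeOps += event.stats?.compute_ops ?? 0;
  }
}

console.log({ readOps, computeOps });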
How filters affect costs and performance
Event sources may discard events based on filters.

For example, an event source with the following query uses a filter to only emit events for Product documents with a category of produce:

let produce = Category.byName("produce").first()
Product
  .all()
  .where(.category == produce)
  .eventSource()

To do this, Fauna processes an event for any change to any Product document. It then discards events for documents without a category of produce. These discarded events still consume operations for your account.
To track changes on a large Set, we recommend using an index event source. For example, the following event source emits events similar to the previous one. However, it only tracks the index's terms and values fields:

let produce = Category.byName("produce").first()
Product
  .byCategory(produce)
  .eventSource()
Another source of discarded events is privilege predicates in roles. For example, the following role uses predicates to grant its members read and write access only to Product documents with a category of produce:

role ProduceManager {
  privileges Product {
    write {
      predicate ((product, _) => product?.category?.name == "produce")
    }
    read {
      predicate (product => product?.category?.name == "produce")
    }
  }
}
An event source created using an authentication secret with this role only emits events for documents the role can access. Other events are discarded. These discarded events still consume operations for your account.
Limitations
- Operation limits apply to event sources.
- While processing events, Fauna runs one query per transaction.
- An event source can't replay or process more than 128 events at a time. If an event source has more than 128 events to process, Fauna closes the event source with an error event.
- You can't create event sources for:
  - An index for a system collection.
  - A Set that combines documents from multiple collections.