Event Feeds and Event Streaming reference
An event stream emits an event whenever tracked changes are made to a database. Applications can consume the events in two ways:
- Event Feeds: Asynchronous requests that poll the stream for paginated events.
- Event Streaming: A real-time subscription that pushes events from the stream to your application using an open connection to Fauna.
Use cases
Event Feeds and Event Streaming are useful for building features that need to react to data changes, such as:
- Change data capture (CDC)
- Real-time dashboards
- Chat apps
- Pub/sub integration
- Multiplayer games
Create an event stream
To create an event stream, call set.toStream() or set.changesOn() on a supported Set in an FQL query:

- set.toStream() tracks events for any change to any document in the Set:

  Product.all().toStream()

- set.changesOn() accepts a list of document fields. It tracks events for changes to the specified fields for documents in the Set:

  Product.sortedByPriceLowToHigh().changesOn(.price)

Both methods return a stream token:
{
"data":"g9WD1YPG...", // Stream token
"static_type":"Stream<Product>",
"summary":"",
"txn_ts":1718340750415639,
"stats": {
"compute_ops": 1,
"read_ops": 0,
"write_ops": 0,
"query_time_ms": 30,
"contention_retries": 0,
"storage_bytes_read": 0,
"storage_bytes_write": 0,
"rate_limits_hit": []
},
"schema_version":1718034700060000
}
You can use the stream token to consume the stream and its events. The stream does not start until you first consume it.
Consume an event stream
Applications typically consume event streams using a Fauna client driver. The drivers can consume a stream using an asynchronous Event Feed or a real-time Event Streaming subscription.
The drivers provide methods for defining, consuming, and iterating through streams without directly handling stream tokens.
Event Feeds
To use Event Feeds, you must have a Pro or Enterprise plan.
The following Fauna client drivers support Event Feeds:
Follow the links for driver-specific documentation and examples.
How Event Feeds work
To start an event stream and request an Event Feed for the stream, the client driver sends a request containing a stream token to the Event Feed HTTP API endpoint.
When you first poll an event stream using an Event Feed, you usually specify a start_ts (start timestamp). start_ts is an integer representing a time in microseconds since the Unix epoch. The request returns events that occurred after the specified timestamp (exclusive). page_size limits the number of events returned per page:
curl -X POST \
'https://db.fauna.com/changefeed/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"start_ts": 1710968002310000,
"page_size": 10
}'
The response includes an array of events for the stream:
{
"events": [
{
"type": "update",
"data": {
"@doc": {
"id": "<DOCUMENT_ID>",
"coll": {
"@mod": "Product"
},
"ts": {
"@time": "2099-09-04T21:14:29.970Z"
},
"name": "cups",
"description": "Translucent 9 Oz, 100 ct",
...
}
},
"txn_ts": 1725484469970000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 1,
"storage_bytes_read": 320,
"compute_ops": 1,
"processing_time_ms": 1,
"rate_limits_hit": []
}
},
...
],
"cursor": "gsGabc456", // Top-level cursor
"has_next": true,
"stats": {
"read_ops": 9,
"storage_bytes_read": 886,
"compute_ops": 1,
"processing_time_ms": 8,
"rate_limits_hit": []
}
}
If the response's has_next property is true, the response includes a top-level cursor property. The client driver can use this cursor to get the next page of events:
curl -X POST \
'https://db.fauna.com/changefeed/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"cursor": "gsGabc456",
"page_size": 10
}'
Response:
{
"events": [
{
"type": "update",
"data": {
"@doc": {
"id": "<DOCUMENT_ID>",
"coll": {
"@mod": "Product"
},
"ts": {
"@time": "2099-09-04T21:14:29.970Z"
},
"name": "clear cups",
"description": "Translucent 9 Oz, 100 ct",
...
}
},
"txn_ts": 1725484469970000,
"cursor": "gsGabc456",
"stats": {
"read_ops": 1,
"storage_bytes_read": 320,
"compute_ops": 1,
"processing_time_ms": 1,
"rate_limits_hit": []
}
},
...
],
"cursor": "gsGabc789",
"has_next": true,
"stats": {
"read_ops": 9,
"storage_bytes_read": 886,
"compute_ops": 1,
"processing_time_ms": 8,
"rate_limits_hit": []
}
}
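The paging loop above is what the drivers implement for you. As a rough sketch of that behavior, the following Python code pages through an Event Feed until has_next is false, using the request and response shapes shown above. The fetch_page helper and its use of urllib are illustrative assumptions, not driver code; swap in your own HTTP client.

```python
import json
import urllib.request

# Hypothetical endpoint constant, matching the curl examples above.
FEED_URL = "https://db.fauna.com/changefeed/1"

def fetch_page(secret, body, url=FEED_URL):
    """POST one Event Feed request body and return the decoded JSON page."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {secret}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def iter_feed_events(token, secret, page_size=10, start_ts=None, fetch=fetch_page):
    """Yield events page by page until the feed reports has_next: false."""
    body = {"token": token, "page_size": page_size}
    if start_ts is not None:
        body["start_ts"] = start_ts
    while True:
        page = fetch(secret, body)
        yield from page.get("events", [])
        if not page.get("has_next"):
            break
        # Subsequent requests page with the top-level cursor, not start_ts.
        body = {"token": token, "page_size": page_size, "cursor": page["cursor"]}
```

The fetch parameter exists so the pagination logic can be exercised without a live connection.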
Get events after a specific start time
To get events after a specific time, the client driver uses the start_ts request body parameter:
curl -X POST \
'https://db.fauna.com/changefeed/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"page_size": 10,
"start_ts": 1710968002310000
}'
start_ts must be later than the creation time of the stream token. The period between the request and the start_ts can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes.
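The replay-window rule above can be expressed as a small check. This is an illustrative sketch, not Fauna's implementation; times are microseconds since the Unix epoch, as in start_ts.

```python
import time

MICROS_PER_DAY = 24 * 60 * 60 * 1_000_000
MICROS_PER_MINUTE = 60 * 1_000_000

def replay_window_micros(history_days):
    """Window within which events can be replayed."""
    if not history_days:  # 0 or unset: limited to 15 minutes
        return 15 * MICROS_PER_MINUTE
    return history_days * MICROS_PER_DAY

def start_ts_is_replayable(start_ts, history_days, now_micros=None):
    """True if start_ts falls inside the collection's replay window."""
    now = now_micros if now_micros is not None else int(time.time() * 1_000_000)
    return now - start_ts <= replay_window_micros(history_days)
```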
Get events after a specific event cursor
To get events from a previous event's cursor, the client driver uses the cursor request body parameter. The stream will replay events that occurred after the cursor (exclusive):
curl -X POST \
'https://db.fauna.com/changefeed/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"cursor": "gsGabc456",
"page_size": 10
}'
The period between the request and the cursor event's txn_ts (transaction timestamp) can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Event Streaming
The following Fauna client drivers support real-time Event Streaming:
Follow the links for driver-specific documentation and examples.
How Event Streaming works
To start and subscribe to an event stream’s events in real time, the client driver sends a request containing the stream token to the Stream HTTP API endpoint:
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>"
}'
In response, the stream emits a status event, indicating the subscription has started.
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
The Stream API request's connection remains open. If a tracked change occurs, the stream emits a related add, remove, or update event. These events include the triggering document, encoded using the tagged format, in the data field:
{
"type": "update",
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2099-03-21T12:35:18.680Z" },
"name": "pizza",
"description": "Frozen Cheese",
...
}
},
"txn_ts": 1711024518680000,
"cursor": "gsGdef456",
"stats": {
...
}
}
If a change occurs between the creation of the stream token and the start of a stream, the stream replays and emits the related events when the stream starts.
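A subscriber typically branches on the event type as events arrive. The following Python sketch shows one way to do that; the events iterable and handler callbacks are assumptions standing in for whatever your client produces from the open connection.

```python
def dispatch_events(events, on_change, on_status=None, on_error=None):
    """Route decoded stream events by type; return the last seen cursor."""
    last_cursor = None
    for event in events:
        kind = event.get("type")
        if kind == "status":
            if on_status:
                on_status(event)
        elif kind == "error":
            if on_error:
                on_error(event)
            break  # the stream can no longer be consumed
        elif kind in ("add", "update", "remove"):
            on_change(event)
        last_cursor = event.get("cursor", last_cursor)
    # Keeping the last cursor lets you restart the stream from this point.
    return last_cursor
```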
Subscription disconnection
Fauna’s client drivers can detect connection loss and automatically reconnect disconnected Event Streaming subscriptions. Events that occur during network issues are replayed and emitted when the subscription reconnects.
When a subscription reconnects, the stream emits a new status event:
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Restart an Event Streaming subscription
The Stream HTTP API endpoint supports two methods for restarting disconnected stream subscriptions: restarting from an event cursor or restarting from a transaction timestamp. The methods are mutually exclusive and can't be used together.
Restart from an event cursor
To restart a stream from a previous event's cursor, the client driver uses the cursor request body parameter. The restarted stream will replay events that occurred after the cursor (exclusive):
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"cursor": "gsGabc123"
}'
The period between the subscription restart and the cursor event's txn_ts (transaction timestamp) can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Restart from a transaction timestamp
To restart a subscription from a transaction timestamp, the client driver uses the start_ts request body parameter. start_ts is an integer representing the subscription start time in microseconds since the Unix epoch:
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<STREAM_TOKEN>",
"start_ts": 1710968002310000
}'
start_ts must be later than the creation time of the stream token. The period between the subscription restart and the start_ts can't exceed the history_days setting for the source Set's collection. If history_days is 0 or unset, the period is limited to 15 minutes.
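Because cursor and start_ts are mutually exclusive, a restart request body can be built with a simple guard. This is an illustrative helper, not driver API; the field names match the curl examples above.

```python
def build_restart_body(token, cursor=None, start_ts=None):
    """Build a Stream restart request body, enforcing cursor XOR start_ts."""
    if cursor is not None and start_ts is not None:
        raise ValueError("cursor and start_ts can't be used together")
    body = {"token": token}
    if cursor is not None:
        body["cursor"] = cursor
    if start_ts is not None:
        body["start_ts"] = start_ts
    return body
```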
Event Streaming sample app
The Event Streaming sample app shows how you can use Event Streaming to build a real-time chat app. You can use it as a starting point for your own app.
See the Event Streaming sample app.
Permission changes
If the authentication secret used to start a subscription is revoked or the secret’s privileges change, the subscription using the token closes due to permission loss. This applies even if the secret still has access to the documents the stream is tracking.
Supported sources
You can only create a stream on a Set from a supported source. The source affects the exact behavior of set.toStream() or set.changesOn().
Supported source | Behavior
---|---
User-defined collection | You can't create a stream on a system collection.
User-defined index | You can't create a stream on an index for a system collection.
Collection streams
Calling set.toStream() directly on collection.all() tracks any change to any document in the collection. The following query tracks any change to documents in the Product collection:
Product.all().toStream()
For example, if you change a Product document's price to below 100_00, the stream emits an update event.
You can use collection.where() to filter the tracked documents for a collection. For example, the following query only tracks Product documents with a price of less than 100_00.
Product.where(.price < 100_00).toStream()
If you change a Product document's price from above 100_00 to below 100_00, the stream emits an add event. Before the change, the document would not have been part of the stream's Set.
You can use set.changesOn() to only track changes to specific fields. The following query tracks changes made to any Product document's description. The stream doesn't emit events for changes to other fields.
Product.all().changesOn(.description)
Index streams
Index streams only emit events for changes to the index's terms or values fields.

For example, the following Product collection's byCategory() index has:

- A term field of category
- Value fields of name and price
collection Product {
*: Any
index byCategory {
terms [.category]
values [.name, .price]
}
...
}
The following query only tracks changes to the category, name, or price fields for Product documents with a category of produce.
let produce = Category.byName("produce").first()
Product.byCategory(produce).toStream()
When called on an index, set.changesOn() only accepts the index's terms or values fields as arguments. For example, in the following query, set.changesOn() only accepts .category, .name, or .price as arguments.
let produce = Category.byName("produce").first()
Product.byCategory(produce).changesOn(.category, .name)
Document streams
You can use streams to track changes to a Set containing a single document. These streams only emit events when the document changes.
Use Set.single() to create a Set from a document.
let product = Product.byId(111)!
Set.single(product).toStream()
Use set.changesOn() to only track changes to specific fields of the document.
let product = Product.byId(111)!
Set.single(product).changesOn(.name, .price)
Resource deletion
If the database or source for a stream is deleted, the stream won’t emit any further events. Subscriptions for the stream don’t automatically close.
Supported transformations and filters
Streams only support source Sets that are transformed or filtered using supported methods, such as set.where(), set.map(), and field projection. This ensures Fauna can convert the Set to a stream. Sets using unsupported transformations or filters will fail to convert.
For example, the Set for the following stream uses the unsupported set.drop() method.
Product.all().drop(10).toStream()
Running the query returns the following error:
invalid_receiver: can't call `.toStream()` because streaming is not supported on sets returned from `.drop()`.
error: can't call `.toStream()` because streaming is not supported on sets returned from `.drop()`.
at *query*:1:32
|
1 | Product.all().drop(10).toStream()
| ^^
|
Filters
Use set.where() to filter a stream's source Set.

For example, the following query only tracks changes to Product documents with:

- A category of produce
- A price less than 100_00
let produce = Category.byName("produce").first()
Product
.all()
.where(.category == produce)
.where(.price < 100_00)
.toStream()
You can also call set.where() directly on set.toStream() or set.changesOn(). The following query is equivalent to the previous one.
let produce = Category.byName("produce").first()
Product
.all()
.toStream()
.where(.category == produce)
.where(.price < 100_00)
set.where() produces a new Set based on its criteria. The criteria affect the event types emitted for changes:

- Creating a document in the Set produces an add event.
- Updating a document so that it moves into the Set produces an add event.
- Updating a document so that it remains in the Set produces an update event.
- Updating a document so that it moves out of the Set produces a remove event.
- Deleting a document from the Set produces a remove event.
- Any other changes produce no events.
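The rules above reduce to a small decision table: the emitted event type depends on whether the document matched the Set's criteria before and after the change. A minimal sketch, with hypothetical change labels:

```python
def filtered_event_type(change, in_set_before, in_set_after):
    """Return the event type a filtered Set emits, or None for no event.

    change is one of 'create', 'update', or 'delete' (illustrative labels).
    """
    if change == "create":
        return "add" if in_set_after else None
    if change == "delete":
        return "remove" if in_set_before else None
    if change == "update":
        if not in_set_before and in_set_after:
            return "add"       # document moved into the Set
        if in_set_before and in_set_after:
            return "update"    # document remained in the Set
        if in_set_before and not in_set_after:
            return "remove"    # document moved out of the Set
    return None
```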
While filters affect events emitted for a stream, they don’t affect event processing, which impacts performance and cost. See How filters affect costs and performance.
Projection
A stream's add and update event types include a data field. This field contains the document that triggered the event. Use set.map() or projection to return only specific document fields in these events.
For example, the following query tracks changes to any field in any Product document. The query uses set.map() to only include the name and price document fields in the data field of add and update events.
Product
.all()
.map(product => {
name: product.name,
price: product.price
})
.toStream()
The following query uses projection and is equivalent to the previous one.
let products = Product.all() { name, price }
products.toStream()
The previous queries can produce the following add event. The event's data field includes only the name and price document fields.
{
"type": "add",
"data": { "name": "pizza", "price": "1599" },
"txn_ts": 1711028312060000,
"cursor": "gsGghu789",
"stats": {
"read_ops": 1,
"storage_bytes_read": 69,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Events
Streams emit one event per document per transaction.
Event order
Events are ordered by ascending txn_ts (transaction timestamp). Events from the same transaction share the same txn_ts, but their order may differ across clients.
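Because events from one transaction share a txn_ts, a consumer can group an event list by txn_ts to recover per-transaction batches. A minimal sketch:

```python
from itertools import groupby

def group_by_transaction(events):
    """Group events into per-transaction batches, ordered by txn_ts."""
    ordered = sorted(events, key=lambda e: e["txn_ts"])  # stable sort
    return [list(batch) for _, batch in groupby(ordered, key=lambda e: e["txn_ts"])]
```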
Event types
The following table outlines supported event types.
Event type | Sent when …
---|---
add | A document is added to the Set.
remove | A document is removed from the Set.
update | A document in the Set changes.
status | A stream starts or reconnects. It's also sent periodically. See the status event schema. Event Feeds don't receive or include status events.
error | A stream can no longer be consumed due to an error. See the error event schema and Error codes.
Event schema
Events with a type other than status or error have the following schema:
{
"type": "add",
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2099-03-20T21:46:12.580Z" },
"foo": "bar"
}
},
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
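FQL values in the data field use the tagged format, which the drivers decode for you. As an illustration only, the sketch below decodes the tags shown in the examples above (@doc, @mod, @time) plus @int and @long, and passes other values through unchanged:

```python
from datetime import datetime

def untag(value):
    """Recursively decode a tagged-format value into plain Python values."""
    if isinstance(value, dict):
        if "@doc" in value:
            return untag(value["@doc"])          # unwrap the document
        if "@mod" in value:
            return value["@mod"]                 # collection name as a string
        if "@time" in value:
            return datetime.fromisoformat(value["@time"].replace("Z", "+00:00"))
        if "@int" in value or "@long" in value:
            return int(value.get("@int", value.get("@long")))
        return {k: untag(v) for k, v in value.items()}
    if isinstance(value, list):
        return [untag(v) for v in value]
    return value
```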
status event types have the following schema:
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 0,
"storage_bytes_read": 0,
"compute_ops": 0,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
error event types have the following schema:
{
"type": "error",
"error": {
"code": "invalid_stream_start_time",
"message": "Stream start time 2099-09-05T14:27:10.100Z is too far in the past. Recreate the stream and try again."
},
"stats": {
"read_ops": 0,
"storage_bytes_read": 0,
"compute_ops": 0,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Field name | Type | Description
---|---|---
type | string | Event type: add, remove, update, status, or error. Event Feeds don't receive or include status events.
data | object | Document that triggered the event. FQL values are encoded using the tagged format.
error | object | Contains an error for the stream. Only included in error events.
txn_ts | integer | The related transaction's commit time in microseconds since the Unix epoch.
cursor | string | Cursor for the event. If a stream disconnects, the Fauna HTTP API and client drivers can use the cursor to replay events that occurred after the cursor. See Restart from an event cursor.
stats | object | Event statistics.
Error codes
The following table outlines possible error codes for error events.

Error code | Cause
---|---
| An internal error caused by Fauna.
invalid_stream_start_time | The requested stream start time is too far in the past. The collection containing the stream's document Set doesn't retain enough history to replay requested events. See Restart a stream.
| The authentication secret used to create the stream token was revoked or the secret's privileges changed. See Permission changes.
| The stream attempts to process more than 128 events at once, exceeding the event limit.
| The stream would replay more than 128 events at once, exceeding the event limit.
Costs and performance
A stream’s cost and performance are closely related to its shape. A stream’s shape is defined by:
- The source Set
- Transformations and filters applied to the source Set
Processing and sending events for streams consume Transactional Read Operations (TROs) and Transactional Compute Operations (TCOs).
The exact number of TROs and TCOs consumed varies based on the stream’s shape. See Event Streaming subscriptions in the billing docs.
Depending on its cardinality and throughput, consuming a stream for a large Set may cause delays in event delivery and consume more operations.
If a stream replays events, it may also consume additional operations.
Each stream event includes statistics that report consumed operations. If you exceed your plan's operations limit, Fauna closes the stream with an error event.
How filters affect costs and performance
Streams may discard events based on filters.
For example, a stream with the following query uses a filter to only emit events for Product documents with a category of produce:
let produce = Category.byName("produce").first()
Product
.all()
.where(.category == produce)
.toStream()
To do this, Fauna processes an event for any change to any Product document. It then discards events for documents without a category of produce. These discarded events still consume operations for your account.
To track changes on a large Set, we recommend using an index stream.
For example, the following stream emits events similar to the previous one. However, it only tracks the index's terms and values fields:
let produce = Category.byName("produce").first()
Product
.byCategory(produce)
.toStream()
Another source of discarded events is privilege predicates in roles. For example, the following role uses predicates to grant its members read and write access only to Product documents with a category of produce:
role ProduceManager {
privileges Product {
write {
predicate ((product, _) => product?.category?.name == "produce")
}
read {
predicate (product => product?.category?.name == "produce")
}
}
}
A stream token created using an authentication secret with this role only emits events for documents the role can access. Other events are discarded. These discarded events still consume operations for your account.
Limitations
- Operation limits apply to event streams.
- While processing events, Fauna runs one query per transaction.
- A stream can't replay or process more than 128 events at a time. If a stream has more than 128 events to process, Fauna closes the stream with an error event.
- You can't create streams for:
  - An index for a system collection.
  - A Set that combines documents from multiple collections.