Event Feeds and Event Streaming reference
An event source emits an event whenever tracked changes are made to a database. Applications can consume the events in two ways:
- Event Feeds: Asynchronous requests that poll the event source for paginated events.
- Event Streaming: A real-time subscription that pushes events from the event source to your application using an open connection to Fauna.
Use cases
Event Feeds and Event Streaming are useful for building features that need to react to data changes, such as:
- Change data capture (CDC)
- Real-time dashboards
- Chat apps
- Pub/sub integration
- Multiplayer games
Create an event source
To create an event source, call set.eventSource() or set.eventsOn() on a supported Set in an FQL query:

- set.eventSource() tracks events for any change to any document in the Set:

  Product.all().eventSource()

- set.eventsOn() accepts a list of document fields. It tracks events for changes to the specified fields for documents in the Set:

  Product.sortedByPriceLowToHigh().eventsOn(.price)

set.eventSource() and set.eventsOn() return an event source:

"g9WD1YPG..."
You can consume the event source as an asynchronous Event Feed or a real-time Event Stream. The event source does not start until you first consume it.
Consume an event source
Applications typically consume event sources using a Fauna client driver. The drivers provide methods for defining, consuming, and iterating through an event source’s events without directly handling event sources.
Event Feeds
To use Event Feeds, you must have a Pro or Enterprise plan.
The following Fauna client drivers support Event Feeds:
Follow the links for driver-specific documentation and examples.
How Event Feeds work
To start an event source and request an Event Feed for the event source, the client driver sends a request containing an event source to the Event Feed HTTP API endpoint.
When you first poll an event source using an Event Feed, you usually specify a start_ts (start timestamp). start_ts is an integer representing a time in microseconds since the Unix epoch. The request returns events that occurred after the specified timestamp (exclusive). page_size limits the number of events returned per page:
curl -X POST \
  'https://db.fauna.com/feed/1' \
  -H "Authorization: Bearer $FAUNA_SECRET" \
  -H 'Content-Type: application/json' \
  -d '{
    "token": "<EVENT_SOURCE>",
    "start_ts": 1710968002310000,
    "page_size": 10
  }'
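Because start_ts is a plain integer of microseconds since the Unix epoch, you can derive it from any clock. A minimal Python sketch, assuming you're building the request body yourself (the to_start_ts helper is ours, not part of a Fauna driver):

```python
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def to_start_ts(dt: datetime) -> int:
    """Microseconds since the Unix epoch, the integer format start_ts expects."""
    delta = dt - EPOCH
    return delta.days * 86_400_000_000 + delta.seconds * 1_000_000 + delta.microseconds

# For example, poll for events that occurred after 2024-01-01T00:00:00Z:
start_ts = to_start_ts(datetime(2024, 1, 1, tzinfo=timezone.utc))
```

Using timedelta arithmetic rather than float timestamps keeps the conversion exact at microsecond precision.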
The response includes an array of events for the event source:
{
"events": [
{
"type": "update",
"data": {
"@doc": {
"id": "111",
"coll": {
"@mod": "Product"
},
"ts": {
"@time": "2099-09-04T21:14:29.970Z"
},
"name": "cups",
"description": "Translucent 9 Oz, 100 ct",
...
}
},
"txn_ts": 1725484469970000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 1,
"storage_bytes_read": 320,
"compute_ops": 1,
"processing_time_ms": 1,
"rate_limits_hit": []
}
},
...
],
"cursor": "gsGabc456", // Top-level cursor
"has_next": true,
"stats": {
"read_ops": 9,
"storage_bytes_read": 886,
"compute_ops": 1,
"processing_time_ms": 8,
"rate_limits_hit": []
}
}
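Event data arrives in Fauna's tagged format (the @doc, @mod, and @time wrappers above). The client drivers decode this for you; as an illustrative sketch, a consumer working with raw JSON could unwrap just the tags shown in this reference (the untag helper is ours and handles only those three tags):

```python
from datetime import datetime

def untag(value):
    """Recursively decode the Fauna tagged-format wrappers shown in this
    reference (@doc, @mod, @time) into plain Python values."""
    if isinstance(value, dict):
        if "@doc" in value:
            return untag(value["@doc"])
        if "@mod" in value:
            return value["@mod"]  # collection name
        if "@time" in value:
            return datetime.fromisoformat(value["@time"].replace("Z", "+00:00"))
        return {k: untag(v) for k, v in value.items()}
    if isinstance(value, list):
        return [untag(v) for v in value]
    return value

# The `data` field of the example event above:
event_data = {
    "@doc": {
        "id": "111",
        "coll": {"@mod": "Product"},
        "ts": {"@time": "2099-09-04T21:14:29.970Z"},
        "name": "cups",
    }
}
doc = untag(event_data)
```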
If the response’s has_next property is true, the response includes a top-level cursor property. The client driver can use this cursor to get the next page of events:
curl -X POST \
'https://db.fauna.com/feed/1' \
-H "Authorization: Bearer $FAUNA_SECRET" \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>",
"cursor": "gsGabc456",
"page_size": 10
}'
Response:
{
"events": [
{
"type": "update",
"data": {
"@doc": {
"id": "111",
"coll": {
"@mod": "Product"
},
"ts": {
"@time": "2099-09-04T21:14:29.970Z"
},
"name": "clear cups",
"description": "Translucent 9 Oz, 100 ct",
...
}
},
"txn_ts": 1725484469970000,
"cursor": "gsGabc456",
"stats": {
"read_ops": 1,
"storage_bytes_read": 320,
"compute_ops": 1,
"processing_time_ms": 1,
"rate_limits_hit": []
}
},
...
],
"cursor": "gsGabc789",
"has_next": true,
"stats": {
"read_ops": 9,
"storage_bytes_read": 886,
"compute_ops": 1,
"processing_time_ms": 8,
"rate_limits_hit": []
}
}
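Putting the two requests together, a feed consumer loops until has_next is false, passing each response's top-level cursor back on the next request. A hedged Python sketch (fetch_page stands in for an HTTP POST to the /feed/1 endpoint; fetch_all_events is an illustrative name, not a driver API):

```python
def fetch_all_events(fetch_page, token, page_size=10):
    """Drain an Event Feed by following the top-level cursor until
    has_next is false. `fetch_page` stands in for a POST to /feed/1."""
    events, cursor = [], None
    while True:
        request = {"token": token, "page_size": page_size}
        if cursor is not None:
            request["cursor"] = cursor  # resume after the last page
        page = fetch_page(request)
        events.extend(page["events"])
        if not page.get("has_next"):
            return events
        cursor = page["cursor"]

# A stubbed two-page feed, for illustration only.
pages = iter([
    {"events": [{"type": "add"}], "cursor": "gsGabc456", "has_next": True},
    {"events": [{"type": "update"}], "cursor": "gsGabc789", "has_next": False},
])
all_events = fetch_all_events(lambda req: next(pages), "<EVENT_SOURCE>")
```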
Get events after a specific start time
To get events after a specific time, the client driver uses the start_ts request body parameter:
curl -X POST \
'https://db.fauna.com/feed/1' \
-H "Authorization: Bearer $FAUNA_SECRET" \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>",
"page_size": 10,
"start_ts": 1710968002310000
}'
start_ts must be later than the creation time of the event source. The period between the request and the start_ts can’t exceed the history_days setting for the source Set’s collection. If history_days is 0 or unset, the period is limited to 15 minutes.
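The history window rule above can be sketched as a small pre-check before polling. This is an illustration of the stated limits, not a Fauna API; both function names are ours:

```python
def max_replay_window_us(history_days: int = 0) -> int:
    """Replay window in microseconds: history_days if set, else 15 minutes."""
    if history_days:
        return history_days * 24 * 60 * 60 * 1_000_000
    return 15 * 60 * 1_000_000

def start_ts_is_replayable(start_ts: int, now_us: int, history_days: int = 0) -> bool:
    """True if start_ts still falls inside the collection's history window."""
    return now_us - start_ts <= max_replay_window_us(history_days)
```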
Get events after a specific cursor
To get events from a previous event’s cursor, the client driver uses the cursor request body parameter. The event source replays events that occurred after the cursor (exclusive):
curl -X POST \
'https://db.fauna.com/feed/1' \
-H "Authorization: Bearer $FAUNA_SECRET" \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>",
"cursor": "gsGabc456",
"page_size": 10
}'
The period between the request and the cursor event’s txn_ts (transaction timestamp) can’t exceed the history_days setting for the source Set’s collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Event Streaming
The following Fauna client drivers support real-time Event Streaming:
Follow the links for driver-specific documentation and examples.
How Event Streaming works
To start and subscribe to an event source’s events in real time, the client driver sends a request containing the event source to the Stream HTTP API endpoint:
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>"
}'
In response, the event source emits a status event, indicating the subscription has started:
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
The Stream API request’s connection remains open. If a tracked change occurs, the event source emits a related add, remove, or update event. These events include the triggering document, encoded using the tagged format, in the data field:
{
"type": "update",
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2099-03-21T12:35:18.680Z" },
"name": "pizza",
"description": "Frozen Cheese",
...
}
},
"txn_ts": 1711024518680000,
"cursor": "gsGdef456",
"stats": {
...
}
}
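A stream consumer typically branches on the event's type field, since only add, remove, and update events carry a document. A minimal Python sketch (handle_event is an illustrative name, not a driver method):

```python
def handle_event(event, on_change, on_status=None, on_error=None):
    """Route a single stream event. add/remove/update events carry the
    triggering document in `data`; status and error events do not."""
    kind = event["type"]
    if kind in ("add", "update", "remove"):
        on_change(kind, event.get("data"), event["cursor"])
    elif kind == "status" and on_status:
        on_status(event["cursor"])
    elif kind == "error" and on_error:
        on_error(event["error"])

seen = []
handle_event(
    {"type": "update", "data": {"name": "pizza"}, "cursor": "gsGdef456"},
    on_change=lambda kind, data, cursor: seen.append((kind, data["name"])),
)
```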
If a change occurs between the creation of the event source and the start of a stream, the stream replays and emits the related events when the stream starts.
Subscription disconnection
Fauna’s client drivers can detect connection loss and automatically reconnect disconnected Event Streaming subscriptions. Events that occur during network issues are replayed and emitted when the subscription reconnects.
When a subscription reconnects, the event source emits a new status event:
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
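The drivers implement this reconnection for you, but the pattern is worth seeing: remember the last event's cursor and pass it when reopening the stream, so events missed during the outage are replayed. A hedged Python sketch (connect stands in for opening a /stream/1 request; all names here are ours):

```python
def consume_stream(connect, token, max_attempts=3):
    """Reconnect a dropped stream from the last seen event cursor.
    `connect` stands in for opening a /stream/1 request; it yields
    events and may raise ConnectionError mid-stream."""
    cursor, events = None, []
    for _ in range(max_attempts):
        request = {"token": token}
        if cursor is not None:
            request["cursor"] = cursor  # replay events after this cursor
        try:
            for event in connect(request):
                if event["type"] != "status":
                    events.append(event)
                cursor = event["cursor"]
            return events  # stream ended cleanly
        except ConnectionError:
            continue  # retry, resuming from the last cursor
    return events

# Stubbed connection: drops once, then delivers the remaining event.
calls = []
def fake_connect(request):
    calls.append(request)
    if len(calls) == 1:
        yield {"type": "add", "cursor": "c1"}
        raise ConnectionError("network blip")
    yield {"type": "update", "cursor": "c2"}

events = consume_stream(fake_connect, "<EVENT_SOURCE>")
```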
Restart an Event Streaming subscription
The Stream HTTP API endpoint supports two methods for restarting disconnected stream subscriptions: restarting from an event cursor and restarting from a transaction timestamp. The methods are mutually exclusive and can’t be used together.
Restart from an event cursor
To restart a stream from a previous event’s cursor, the client driver uses the cursor request body parameter. The restarted stream replays events that occurred after the cursor (exclusive):
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>",
"cursor": "gsGabc123"
}'
The period between the subscription restart and the cursor event’s txn_ts (transaction timestamp) can’t exceed the history_days setting for the source Set’s collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Restart from a transaction timestamp
To restart a subscription from a transaction timestamp, the client driver uses the start_ts request body parameter. start_ts is an integer representing the subscription start time in microseconds since the Unix epoch:
curl -X POST \
'https://db.fauna.com/stream/1' \
-H 'Authorization: Bearer <FAUNA_SECRET>' \
-H 'Content-Type: application/json' \
-d '{
"token": "<EVENT_SOURCE>",
"start_ts": 1710968002310000
}'
start_ts must be later than the creation time of the event source. The period between the subscription restart and the start_ts can’t exceed the history_days setting for the source Set’s collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Event Streaming sample app
The Event Streaming sample app shows how you can use Event Streaming to build a real-time chat app. You can use it as a starting point for your own app.

See the Event Streaming sample app.
Permission changes
If the authentication secret used to create an event source is revoked or the secret’s privileges change, the stream using the event source closes due to permission loss. This applies even if the secret still has access to the documents the event source is tracking.
Supported Sets
You can only create an event source on a supported Set. The source affects the exact behavior of set.eventSource() or set.eventsOn().
Supported Set | Behavior
---|---
User-defined collection | Tracks changes to documents in the collection. You can’t create an event source on a system collection.
User-defined index | Tracks changes to the index’s terms and values fields. You can’t create an event source on an index for a system collection.
Collection event sources
Calling set.eventSource() directly on collection.all() tracks any change to any document in the collection.

The following query tracks any change to documents in the Product collection:

Product.all().eventSource()

For example, if you change a Product document’s price to below 100_00, the event source emits an update event.
You can use collection.where() to filter the tracked documents for a collection. For example, the following query only tracks Product documents with a price of less than 100_00:

Product.where(.price < 100_00).eventSource()

If you change a Product document’s price from above 100_00 to below 100_00, the event source emits an add event. Before the change, the document would not have been part of the event source’s Set.
You can use set.eventsOn() to only track changes to specific fields. The following query tracks changes made to any Product document’s description. The event source doesn’t emit events for changes to other fields:

Product.all().eventsOn(.description)
Index event sources
Index event sources only emit events for changes to the index’s terms or values fields.

For example, the following Product collection’s byCategory() index has:

- A term field of category
- Value fields of name and price
collection Product {
*: Any
index byCategory {
terms [.category]
values [.name, .price]
}
...
}
The following query only tracks changes to the category, name, or price fields for Product documents with a category of produce:
let produce = Category.byName("produce").first()
Product.byCategory(produce).eventSource()
When called on an index, set.eventsOn() only accepts the index’s terms or values fields as arguments. For example, in the following query, set.eventsOn() only accepts .category, .name, or .price as arguments:
let produce = Category.byName("produce").first()
Product.byCategory(produce).eventsOn(.category, .name)
Document event sources
You can use event sources to track changes to a Set containing a single document. These event sources only emit events when the document changes.
Use Set.single() to create a Set from a document:
let product = Product.byId(111)!
Set.single(product).eventSource()
Use set.eventsOn() to only track changes to specific fields of the document:
let product = Product.byId(111)!
Set.single(product).eventsOn(.name, .price)
Resource deletion
If the database or source for an event source is deleted, the event source won’t emit any further events. Subscriptions for the event source don’t automatically close.
Supported transformations and filters
Event sources only support source Sets that are transformed or filtered using supported methods, such as set.where(), set.map(), and projection. This ensures Fauna can convert the Set to an event source. Sets using unsupported transformations or filters will fail to convert.
For example, the Set for the following event source uses the unsupported set.drop() method:
Product.all().drop(10).eventSource()
Running the query returns the following error:
invalid_receiver: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`.
error: can't call `.eventSource()` because streaming is not supported on sets returned from `.drop()`.
at *query*:1:35
|
1 | Product.all().drop(10).eventSource()
| ^^
|
Filters
Use set.where() to filter an event source’s Set. For example, the following query only tracks changes to Product documents with:

- A category of produce
- A price less than 100_00
let produce = Category.byName("produce").first()
Product
.all()
.where(.category == produce)
.where(.price < 100_00)
.eventSource()
You can also call set.where() directly on set.eventSource() or set.eventsOn(). The following query is equivalent to the previous one:
let produce = Category.byName("produce").first()
Product
.all()
.eventSource()
.where(.category == produce)
.where(.price < 100_00)
set.where() produces a new Set based on its criteria. The criteria affect the event types emitted for changes:

- Creating a document in the Set produces an add event.
- Updating a document so that it moves into the Set produces an add event.
- Updating a document so that it remains in the Set produces an update event.
- Updating a document so that it moves out of the Set produces a remove event.
- Deleting a document from the Set produces a remove event.
- Any other changes produce no events.
While filters affect events emitted for an event source, they don’t affect event processing, which impacts performance and cost. See How filters affect costs and performance.
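The rules above reduce to the document's Set membership before and after the change. A small Python sketch of that decision, for illustration only (Fauna applies these rules server-side):

```python
def filtered_event_type(in_set_before: bool, in_set_after: bool):
    """Event type a filtered Set emits, given whether the document matched
    the filter before and after the change. Returns None when the change
    isn't visible to the Set."""
    if not in_set_before and in_set_after:
        return "add"     # created in, or moved into, the Set
    if in_set_before and in_set_after:
        return "update"  # changed but remained in the Set
    if in_set_before and not in_set_after:
        return "remove"  # deleted from, or moved out of, the Set
    return None
```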
Projection
An event source’s add and update event types include a data field. This field contains the document that triggered the event. Use set.map() or projection to return only specific document fields in these events.
For example, the following query tracks changes to any field in any Product document. The query uses set.map() to only include the name and price document fields in the data field of add and update events:
Product
.all()
.map(product => {
name: product.name,
price: product.price
})
.eventSource()
The following query uses projection and is equivalent to the previous one.
let products = Product.all() { name, price }
products.eventSource()
The previous queries can produce the following add event. The event’s data field includes only the name and price document fields:
{
"type": "add",
"data": { "name": "pizza", "price": "1599" },
"txn_ts": 1711028312060000,
"cursor": "gsGghu789",
"stats": {
"read_ops": 1,
"storage_bytes_read": 69,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Events
Event sources emit one event per document per transaction.
Event order
Events are ordered by ascending txn_ts (transaction timestamp). Events from the same transaction share the same txn_ts, but their order may differ across clients.
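For consumers that buffer events from several pages or reconnects, this ordering rule suggests a stable sort on txn_ts. A minimal sketch (order_events is an illustrative name):

```python
def order_events(events):
    """Order events by ascending txn_ts. The sort is stable, but events
    sharing a txn_ts (same transaction) have no guaranteed relative
    order, so don't rely on their positions matching across clients."""
    return sorted(events, key=lambda e: e["txn_ts"])

ordered = order_events([
    {"cursor": "c3", "txn_ts": 1711024518680000},
    {"cursor": "c1", "txn_ts": 1710968002310000},
    {"cursor": "c2", "txn_ts": 1710968002310000},
])
```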
Event types
The following table outlines supported event types.
Event type | Sent when …
---|---
add | A document is added to the Set.
remove | A document is removed from the Set. Event sources don’t emit …
update | A document in the Set changes.
status | An Event Stream starts or reconnects. It’s also sent periodically. See the status event schema. Event Feeds don’t receive or include status events.
error | An event source can no longer be consumed due to an error. See the error event schema and Error codes.
Event schema
Events with a type other than status or error have the following schema:
{
"type": "add",
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2099-03-20T21:46:12.580Z" },
"foo": "bar"
}
},
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
status event types have the following schema:
{
"type": "status",
"txn_ts": 1710968002310000,
"cursor": "gsGabc123",
"stats": {
"read_ops": 0,
"storage_bytes_read": 0,
"compute_ops": 0,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
error event types have the following schema:
{
"type": "error",
"error": {
"code": "invalid_stream_start_time",
"message": "Stream start time 2099-09-05T14:27:10.100Z is too far in the past. Recreate the stream and try again."
},
"stats": {
"read_ops": 0,
"storage_bytes_read": 0,
"compute_ops": 0,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Field name | Type | Description
---|---|---
type | string | Event type: add, remove, update, status, or error. Event Feeds don’t receive or include status events.
data | object | Document that triggered the event. FQL values are encoded using the tagged format. Only add, remove, and update events include this field.
error | object | Contains an error for the event source. Only error events include this field.
txn_ts | integer | The related transaction’s commit time in microseconds since the Unix epoch.
cursor | string | Cursor for the event. The Fauna HTTP API and client drivers can use the cursor to replay events that occurred after the cursor. See Restart from an event cursor.
stats | object | Event statistics.
Error codes
The following table outlines possible error codes for error events.
Error code | Cause
---|---
 | An internal error caused by Fauna.
invalid_stream_start_time | The requested Event Stream or Event Feed start time is too far in the past. The collection containing the stream’s document Set doesn’t retain enough history to replay requested events.
 | The authentication secret used to create the event source was revoked or the secret’s privileges changed. See Permission changes.
 | The event source attempts to process more than 128 events at once, exceeding the event limit.
 | The event source would replay more than 128 events at once, exceeding the event limit.
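A consumer should treat an error event as terminal for the event source: surface the code and recreate the source if appropriate. A hedged Python sketch (EventSourceError and check_event are our names, not driver APIs):

```python
class EventSourceError(Exception):
    """Raised when a feed or stream delivers a terminal error event."""
    def __init__(self, code: str, message: str):
        super().__init__(f"{code}: {message}")
        self.code = code

def check_event(event: dict) -> dict:
    """Pass non-error events through; raise on error events so the caller
    can recreate the event source (e.g. after invalid_stream_start_time)."""
    if event["type"] == "error":
        err = event["error"]
        raise EventSourceError(err["code"], err["message"])
    return event
```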
Costs and performance
An event source’s cost and performance are closely related to its shape. An event source’s shape is defined by:
- The source Set
- Transformations and filters applied to the source Set
Processing and sending events consume Transactional Read Operations (TROs) and Transactional Compute Operations (TCOs).
The exact number of TROs and TCOs consumed varies based on the event source’s shape. See Event Streaming subscriptions in the billing docs.
Depending on its cardinality and throughput, consuming an event source for a large Set may cause delays in event delivery and consume more operations.
If an event source replays events, it may also consume additional operations.
Each event includes statistics that report consumed operations. If you exceed your Fauna account’s or plan’s operations limits, the event source emits an error event. For Event Streams, this closes the stream.
How filters affect costs and performance
Event sources may discard events based on filters. For example, an event source with the following query uses a filter to only emit events for Product documents with a category of produce:
let produce = Category.byName("produce").first()
Product
.all()
.where(.category == produce)
.eventSource()
To do this, Fauna processes an event for any change to any Product document. It then discards events for documents without a category of produce. These discarded events still consume operations for your account.
To track changes on a large Set, we recommend using an index event source.
For example, the following event source emits events similar to the previous one. However, it only tracks the index’s terms and values fields:
let produce = Category.byName("produce").first()
Product
.byCategory(produce)
.eventSource()
Another source of discarded events is privilege predicates in roles. For example, the following role uses predicates to grant its members read and write access only to Product documents with a category of produce:
role ProduceManager {
privileges Product {
write {
predicate ((product, _) => product?.category?.name == "produce")
}
read {
predicate (product => product?.category?.name == "produce")
}
}
}
An event source created using an authentication secret with this role only emits events for documents the role can access. Other events are discarded. These discarded events still consume operations for your account.
Limitations
- Operation limits apply to event sources.
- While processing events, Fauna runs one query per transaction.
- An event source can’t replay or process more than 128 events at a time. If an event source has more than 128 events to process, Fauna closes the event source with an error event.
- You can’t create event sources for:
  - An index for a system collection.
  - A set that combines documents from one or more collections.