Check out v4 of the Fauna CLI
v4 of the Fauna CLI is now in beta. The new version introduces enhancements to the developer experience, including an improved authentication workflow. To get started, check out the CLI v4 quick start.
Fauna v10 .NET/C# client driver (current)
Version: 1.0.1 | Repository: fauna/fauna-dotnet
Fauna’s .NET/C# client driver lets you run FQL queries from .NET and C# applications.
This guide shows how to set up the driver and use it to run FQL queries.
This driver can only be used with FQL v10. It’s not compatible with earlier versions of FQL. To use earlier FQL versions, use the faunadb-csharp package.
Installation
The driver is available on NuGet. To install it using the .NET CLI, run:
dotnet add package Fauna
API reference
API reference documentation for the driver is available at https://fauna.github.io/fauna-dotnet/.
Sample app
For a practical example, check out the .NET sample app.
This sample app is an e-commerce application that uses the Fauna .NET/C# driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries.
Basic usage
The following examples:
- Initialize a client instance to connect to Fauna
- Compose a basic FQL query using an FQL string template
- Run the query using QueryAsync() or PaginateAsync()
- Deserialize the results based on a provided type parameter
Use QueryAsync() to run a non-paginated query:
using Fauna;
using Fauna.Exceptions;
using static Fauna.Query;
try
{
// Initialize the client to connect to Fauna
var config = new Configuration("FAUNA_SECRET");
var client = new Client(config);
// Compose a query
var query = FQL($@"
Product.byName('cups').first() {{
name,
description,
price
}}
");
// Run the query
// Optionally specify the expected result type as a type parameter.
// If not provided, the value will be deserialized as object.
var response = await client.QueryAsync<Dictionary<string, object?>>(query);
Console.WriteLine(response.Data["name"]);
Console.WriteLine(response.Data["description"]);
Console.WriteLine(response.Data["price"]);
Console.WriteLine("--------");
}
catch (FaunaException e)
{
Console.WriteLine(e);
}
Queries that return a Set are automatically paginated. Use PaginateAsync() to iterate through paginated results:
using Fauna;
using Fauna.Exceptions;
using static Fauna.Query;
try
{
// Initialize the client to connect to Fauna
var client = new Client("FAUNA_SECRET");
// Compose a query
var query = FQL($@"Category.all() {{ name }}");
// Run the query
// PaginateAsync returns an IAsyncEnumerable of pages
var response = client.PaginateAsync<Dictionary<string, object?>>(query);
await foreach (var page in response)
{
foreach (var category in page.Data)
{
Console.WriteLine(category["name"]);
}
}
}
catch (FaunaException e)
{
Console.WriteLine(e);
}
Connect to Fauna
Each Fauna query is an independently authenticated request to the Query HTTP API endpoint. You authenticate with Fauna using an authentication secret.
Get an authentication secret
Fauna supports several secret types. For testing, you can create a key, which is a type of secret:
- Log in to the Fauna Dashboard.
- On the Explorer page, create a database.
- In the database’s Keys tab, click Create Key.
- Choose a Role of server.
- Click Save.
- Copy the Key Secret. The secret is scoped to the database.
Initialize a client
To send query requests to a Fauna database, initialize a Client instance using an authentication secret scoped to the database:
var client = new Client("FAUNA_SECRET");
Client requires a secret or configuration argument. For configuration options, see Client configuration.
Connect to a child database
A scoped key lets you use a parent database’s admin key to send query requests to its child databases.
For example, if you have an admin key for a parent database and want to connect to a child database named childDB, you can create a scoped key using the following format:
// Scoped key that impersonates an `admin` key for
// the `childDB` child database.
fn...:childDB:admin
You can then initialize a Client instance using the scoped key:
var client = new Client("fn...:childDB:admin");
Multiple connections
You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies.
You can create multiple client instances to connect to Fauna using different credentials or client configurations.
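For example, the following sketch runs two queries concurrently with one client and creates a second client with a different secret. The collection names, count() queries, and FAUNA_ADMIN_SECRET placeholder are illustrative:
// Run two queries concurrently with a single client.
var client = new Client("FAUNA_SECRET");
var productCount = client.QueryAsync<int>(FQL($"Product.all().count()"));
var categoryCount = client.QueryAsync<int>(FQL($"Category.all().count()"));
await Task.WhenAll(productCount, categoryCount);
Console.WriteLine((await productCount).Data);
Console.WriteLine((await categoryCount).Data);
// A separate client instance can use different credentials or configuration.
var adminClient = new Client("FAUNA_ADMIN_SECRET");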
Run FQL queries
Use FQL string templates to compose FQL queries. Run the queries using QueryAsync() or PaginateAsync():
// Unpaginated query
var query = FQL($@"Product.byName('cups').first()");
client.QueryAsync(query);
// Paginated query
// Adjust `pageSize()` size as needed
var paginatedQuery = FQL($@"Category.all().pageSize(2)");
client.PaginateAsync(paginatedQuery);
You can only compose FQL queries using string templates.
Variable interpolation
Use single braces {} to pass native variables to FQL queries. Use {{}} to escape other braces in the query.
// Create a native var
var collectionName = "Product";
// Pass the var to an FQL query
var query = FQL($@"
let collection = Collection({collectionName})
collection.byName('cups').first() {{ price }}"
);
client.QueryAsync(query);
The driver encodes interpolated variables to an appropriate FQL type and uses the wire protocol to pass the query to the Core HTTP API’s Query endpoint. This helps prevent injection attacks.
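For example, the following sketch interpolates a string and an integer; the driver encodes each as the corresponding FQL value. The where() predicate and field names are illustrative:
// Native values are encoded to FQL types by the driver.
var name = "limes"; // encoded as an FQL string
var minPrice = 50; // encoded as an FQL number
var query = FQL($@"
Product.where(.name == {name} && .price >= {minPrice}) {{
name,
price
}}
");
client.QueryAsync(query);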
Query composition
You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query:
// Create a reusable query fragment.
var product = FQL($@"Product.byName(""pizza"").first()");
// Use the fragment in another FQL query.
var query = FQL($@"
let product = {product}
product {{
name,
price
}}
");
client.QueryAsync(query);
POCO mapping
With Fauna.Mapping
, you can map a POCO class to a Fauna document or object
shape:
using Fauna.Mapping;
class Category
{
// Property names are automatically converted to camelCase.
[Id]
public string? Id { get; set; }
// Manually specify a name by providing a string.
[Field("name")]
public string? CatName { get; set; }
}
class Product
{
[Id]
public string? Id { get; set; }
public string? Name { get; set; }
public string? Description { get; set; }
public int Price { get; set; }
// Reference to document
public Ref<Category> Category { get; set; }
}
- [Id]: Should only be used once per class on a field named Id that represents the Fauna document ID. It’s not encoded unless the isClientGenerated flag is true.
- [Ts]: Should only be used once per class on a field named Ts that represents the timestamp of a document. It’s not encoded.
- [Collection]: Typically goes unmodeled. Should only be used once per class on a field named Coll that represents the collection field of a document. It will never be encoded.
- [Field]: Can be associated with any field to override its name in Fauna.
- [Ignore]: Can be used to ignore fields during encoding and decoding.
You can use POCO classes to deserialize query responses:
var query = FQL($@"Product.sortedByPriceLowToHigh()");
var products = client.PaginateAsync<Product>(query).FlattenAsync();
await foreach (var p in products)
{
Console.WriteLine($"{p.Name} {p.Description} {p.Price}");
}
You can also use POCO classes to write to your database:
var product = new Product {
Id = "12345",
Name = "limes",
Description = "Organic, 2 ct",
Price = 95
};
client.QueryAsync(FQL($@"Product.create({product})"));
DataContext
The DataContext class provides a schema-aware view of your database. Subclass it and configure your collections:
class CustomerDb : DataContext
{
public class CustomerCollection : Collection<Customer>
{
public Index<Customer> ByEmail(string email) => Index().Call(email);
public Index<Customer> ByName(string name) => Index().Call(name);
}
public CustomerCollection Customer { get => GetCollection<CustomerCollection>(); }
}
DataContext provides Client querying, which automatically maps your collections to POCO equivalents, even when type hints are not provided.
var db = client.DataContext<CustomerDb>();
var result = await db.QueryAsync(FQL($"Customer.all().first()"));
var customer = (Customer)result.Data!;
Console.WriteLine(customer.Name);
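The DataContext and streaming examples in this guide assume a Customer POCO that isn’t defined above. A minimal sketch, assuming the collection stores name and email fields, might look like this:
using Fauna.Mapping;
class Customer
{
[Id]
public string? Id { get; set; }
public string? Name { get; set; }
public string? Email { get; set; }
}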
Document references
The driver supports document references using the Ref<T> type. There are several ways to work with document references using the driver:
- Fetch the reference without loading the referenced document:
// Gets a Product document.
// The document's `category` field contains a
// reference to a Category document. The
// `category` field is not projected.
var query = FQL($@"
Product.byName('limes').first()
");
var response = await client.QueryAsync<Product>(query);
var product = response.Data;
- Project the document reference to load the referenced document:
// Gets a Product document.
// The `category` field is projected to load the
// referenced document.
var query = FQL($@"
Product.byName('limes').first() {{
name,
category {{
name
}}
}}
");
var response = await client.QueryAsync<Dictionary<string, object?>>(query);
var product = response.Data;
Console.WriteLine(product["name"]);
// Prints the category name.
var category = (Dictionary<string, object?>)product["category"];
Console.WriteLine(category["name"]);
- Use LoadRefAsync() to load the referenced document:
// Gets a Product document.
var query = FQL($@"Product.byName('limes').first()");
var response = await client.QueryAsync<Product>(query);
var product = response.Data;
// Loads the Category document referenced in
// the Product document.
var category = await client.LoadRefAsync(product.Category);
// Prints the category name.
Console.WriteLine(category.Name);
If the reference is already loaded, it returns the cached value without making another query to Fauna:
// This won't run another query if the referenced
// document is already loaded.
var sameCategory = await client.LoadRefAsync(product.Category);
Null documents
A null document can be handled in two ways:
- Let the driver throw an exception and do something with it:
try
{
await client.QueryAsync<SomeCollDoc>(FQL($"SomeColl.byId('123')"));
}
catch (NullDocumentException e)
{
Console.WriteLine(e.Id); // "123"
Console.WriteLine(e.Collection.Name); // "SomeColl"
Console.WriteLine(e.Cause); // "not found"
}
- Wrap your expected type in a Ref<> or NamedRef. You can wrap Dictionary<string,object> and POCOs.
var q = FQL($"Collection.byName('Fake')");
var r = (await client.QueryAsync<NamedRef<Dictionary<string,object>>>(q)).Data;
if (r.Exists)
{
Console.WriteLine(r.Id); // "Fake"
Console.WriteLine(r.Collection.Name); // "Collection"
var doc = r.Get(); // A dictionary with id, coll, ts, and any user-defined fields.
}
else
{
Console.WriteLine(r.Name); // "Fake"
Console.WriteLine(r.Collection.Name); // "Collection"
Console.WriteLine(r.Cause); // "not found"
r.Get(); // this throws a NullDocumentException
}
Pagination
When you wish to paginate a set, such as a collection or index, use PaginateAsync().
Example of a query that returns a Set:
var query = FQL($"Customer.all()");
await foreach (var page in client.PaginateAsync<Customer>(query))
{
// handle each page
}
await foreach (var item in client.PaginateAsync<Customer>(query).FlattenAsync())
{
// handle each item
}
Example of a query that returns an object with an embedded Set:
class MyResult
{
[Field("customers")]
public Page<Customer>? Customers { get; set; }
}
var query = FQL($"{{customers: Customer.all()}}");
var result = await client.QueryAsync<MyResult>(query);
await foreach (var page in client.PaginateAsync(result.Data.Customers!))
{
// handle each page
}
await foreach (var item in client.PaginateAsync(result.Data.Customers!).FlattenAsync())
{
// handle each item
}
Query stats
Successful query responses and ServiceException exceptions include query stats:
try
{
var client = new Client("FAUNA_SECRET");
var query = FQL($@"'Hello world'");
var response = await client.QueryAsync<string>(query);
Console.WriteLine(response.Stats.ToString());
}
catch (FaunaException e)
{
if (e is ServiceException serviceException)
{
Console.WriteLine(serviceException.Stats.ToString());
Console.WriteLine(e);
}
else {
Console.WriteLine(e);
}
}
Client configuration
The Client instance comes with reasonable configuration defaults. We recommend using the defaults in most cases.
If needed, you can configure the client and override the defaults. This also lets you set default Query options.
var config = new Configuration("FAUNA_SECRET")
{
// Configure the client
Endpoint = new Uri("https://db.fauna.com"),
RetryConfiguration = new RetryConfiguration(3, TimeSpan.FromSeconds(20)),
// Set default query options
DefaultQueryOptions = new QueryOptions
{
Linearized = false,
QueryTags = new Dictionary<string, string>
{
{ "tag", "value" }
},
QueryTimeout = TimeSpan.FromSeconds(60),
TraceParent = "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
TypeCheck = false
}
};
var client = new Client(config);
For supported properties, see Fauna.Configuration in the API reference.
Environment variables
The client configuration’s Secret and Endpoint default to the FAUNA_SECRET and FAUNA_ENDPOINT environment variables, respectively.
For example, if you set the following environment variables:
export FAUNA_SECRET=FAUNA_SECRET
export FAUNA_ENDPOINT=https://db.fauna.com/
You can initialize the client with a default configuration:
var client = new Client();
Retries
By default, the client automatically retries query requests that return a limit_exceeded error code. Retries use an exponential backoff.
The client retries a query up to three times by default. The maximum wait time between retries defaults to 20 seconds.
To override these defaults, pass a RetryConfiguration instance to the Client configuration.
var config = new Configuration("FAUNA_SECRET")
{
RetryConfiguration = new RetryConfiguration(3, TimeSpan.FromSeconds(20))
};
var client = new Client(config);
For supported parameters, see Fauna.Core.RetryConfiguration in the API reference.
Query options
The Client configuration sets default query options for the following methods:
- QueryAsync()
- PaginateAsync()
You can pass a QueryOptions argument to override these defaults:
var queryOptions = new QueryOptions
{
Linearized = false,
QueryTags = new Dictionary<string, string>
{
{ "tag", "value" }
},
QueryTimeout = TimeSpan.FromSeconds(60),
TraceParent = "00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00",
TypeCheck = true
};
var query = FQL($@"'Hello world'");
client.QueryAsync(query, queryOptions);
For supported properties, see Fauna.Core.QueryOptions in the API reference.
Event Feeds
The driver supports Event Feeds. An Event Feed asynchronously polls an event source for paginated events.
To use Event Feeds, you must have a Pro or Enterprise plan.
Request an Event Feed
To get an event source, append set.eventSource() or set.eventsOn() to a supported Set.
To get paginated events, pass the event source to EventFeedAsync():
// Get an event source from a supported Set
EventSource eventSource = await client.QueryAsync<EventSource>(FQL($"Product.all().eventSource()"));
var feed = await client.EventFeedAsync<Product>(eventSource);
If changes occur between the creation of the event source and the Event Feed request, the feed replays and emits any related events.
You can also pass a query that produces an event source directly to EventFeedAsync():
var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"));
If you pass an event source query to EventFeedAsync(), the driver creates the event source and requests the event feed at the same time.
In most cases, you’ll get events after a specific event cursor or start time.
Get events after a specific start time
When you first poll an event source using an Event Feed, you usually include a startTs (start timestamp) in the FeedOptions object that’s passed to EventFeedAsync(). The request returns events that occurred after the specified timestamp (exclusive).
startTs is an integer representing a time in microseconds since the Unix epoch:
// Calculate timestamp for 10 minutes ago in microseconds
long tenMinutesAgo = DateTimeOffset.UtcNow.AddMinutes(-10).ToUnixTimeMilliseconds() * 1000;
var feedOptions = new FeedOptions(startTs: tenMinutesAgo);
var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions);
startTs must be later than the creation time of the event source. The period between the request and the startTs can’t exceed the history_days setting for the source Set’s collection. If history_days is 0 or unset, the period is limited to 15 minutes.
Get events after a specific event cursor
After the initial request, you usually get subsequent events using the cursor for the last page or event.
To get events after a cursor (exclusive), include the cursor in the FeedOptions object that’s passed to EventFeedAsync():
var feedOptions = new FeedOptions(cursor: "gsGabc456"); // Cursor for a previous page
var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions);
You can reuse cursors across event sources with identical queries in the same database.
Iterate on an Event Feed
EventFeedAsync() returns a FeedEnumerable instance that acts as an AsyncEnumerator. Use await foreach to iterate through the pages of events:
await foreach (var page in feed)
{
foreach (var evt in page.Events)
{
Console.WriteLine($"Event Type: {evt.Type}");
Product product = evt.Data;
Console.WriteLine($"Product Name: {product.Name}");
}
}
The FeedEnumerable will stop when there are no more events to poll.
Each page includes a top-level cursor. You can include the cursor in a FeedOptions object passed to EventFeedAsync() to poll for events after the cursor.
Error handling
Exceptions can be raised at two different places:
- While fetching a page
- While iterating a page’s events
This distinction allows you to ignore errors originating from event processing. For example:
try
{
await foreach (var page in feed)
{
try
{
foreach (var evt in page.Events)
{
Console.WriteLine($"Event Type: {evt.Type}");
Product product = evt.Data;
Console.WriteLine($"Product Name: {product.Name}");
}
}
// `EventException` is thrown for event processing errors.
catch (EventException eventError)
{
Console.WriteLine($"Feed event error: {eventError}");
}
}
}
catch (Exception error)
{
Console.WriteLine($"Non-retryable error: {error}");
}
Each page’s cursor contains the cursor for the page’s last successfully processed event. If you’re using a loop to poll for changes, using the cursor will skip any events that caused errors.
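For example, a polling loop might track each page’s cursor and pass it to the next EventFeedAsync() request. This is a minimal sketch; it assumes each page exposes a Cursor property, and the polling interval, page size, and query are illustrative:
string? lastCursor = null;
while (true)
{
// Resume from the last processed event, if any.
var options = lastCursor is null
? new FeedOptions(pageSize: 10)
: new FeedOptions(cursor: lastCursor, pageSize: 10);
var feed = await client.EventFeedAsync<Product>(
FQL($"Product.all().eventSource()"), options);
await foreach (var page in feed)
{
foreach (var evt in page.Events)
{
Console.WriteLine($"Event Type: {evt.Type}");
}
// Track the cursor for the page's last processed event.
lastCursor = page.Cursor;
}
// Wait before polling for new events.
await Task.Delay(TimeSpan.FromSeconds(10));
}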
Event Feed options
The client configuration sets the default options for EventFeedAsync(). You can pass a FeedOptions object to override these defaults:
var feedOptions = new FeedOptions(
startTs: 1710968002310000,
pageSize: 10,
cursor: "gsGabc456"
);
var feed = await client.EventFeedAsync<Product>(FQL($"Product.all().eventSource()"), feedOptions);
For supported properties, see FeedOptions in the API reference.
Event Streaming
The driver supports Event Streaming.
Start a stream
To get an event source, append set.eventSource() or set.eventsOn() to a supported Set.
To stream the source’s events, pass the event source to SubscribeStream():
var query = FQL($@"
let set = Customer.all()
{{
initialPage: set.pageSize(10),
eventSource: set.eventSource()
}}
");
var response = await client.QueryAsync<Dictionary<string, object?>>(query);
var eventSource = (EventSource)response.Data["eventSource"]!;
await using var stream = client.SubscribeStream<Customer>(eventSource);
await foreach (var evt in stream)
{
Console.WriteLine($"Received Event Type: {evt.Type}");
if (evt.Data != null) // Status events won't have Data
{
Customer customer = evt.Data;
Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}");
}
}
You can also pass a query that produces an event source directly to EventStreamAsync():
var stream = await client.EventStreamAsync<Customer>(FQL($"Customer.all().eventSource()"));
await foreach (var evt in stream)
{
Console.WriteLine($"Received Event Type: {evt.Type}");
if (evt.Data != null)
{
Customer customer = evt.Data;
Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}");
}
}
Stream options
The Client configuration sets default options for the SubscribeStream() and EventStreamAsync() methods. You can pass a StreamOptions object to override these defaults:
var options = new StreamOptions(
token: "<EVENT_SOURCE>",
cursor: "gsGghu789"
);
var stream = await client.EventStreamAsync<Customer>(
query: FQL($"Customer.all().eventSource()"),
streamOptions: options
);
await foreach (var evt in stream)
{
Console.WriteLine($"Received Event Type: {evt.Type}");
if (evt.Data != null)
{
Customer customer = evt.Data;
Console.WriteLine($"Name: {customer.Name} - Email: {customer.Email}");
}
}
Debug logging
To enable debug logging, set the FAUNA_DEBUG environment variable to an integer for the Microsoft.Extensions.Logging.LogLevel. For example:
- 0: LogLevel.Trace and higher (all messages)
- 3: LogLevel.Warning and higher
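For instance, to log warnings and errors only, you could set:
export FAUNA_DEBUG=3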
The driver logs HTTP request and response details, including headers. For security, the Authorization header is redacted in debug logs but is visible in trace logs.
As of v1.0.0, the driver only outputs logs to the console.
For advanced logging, you can use a custom ILogger implementation, such as Serilog or NLog. Pass the implementation to the Configuration class when instantiating a Client.
Basic example: Serilog
Install the packages:
dotnet add package Serilog
dotnet add package Serilog.Extensions.Logging
dotnet add package Serilog.Sinks.Console
dotnet add package Serilog.Sinks.File
Configure and use the logger:
using Fauna;
using Microsoft.Extensions.Logging;
using Serilog;
using static Fauna.Query;
Log.Logger = new LoggerConfiguration()
.MinimumLevel.Verbose()
.WriteTo.Console()
.WriteTo.File("log.txt",
rollingInterval: RollingInterval.Day,
rollOnFileSizeLimit: true)
.CreateLogger();
var logFactory = new LoggerFactory().AddSerilog(Log.Logger);
var config = new Configuration("mysecret", logger: logFactory.CreateLogger("myapp"));
var client = new Client(config);
await client.QueryAsync(FQL($"1+1"));
// You should see LogLevel.Debug messages in both the Console and the "log{date}.txt" file