Fauna v10 JVM client driver (current)
Version: 1.0.0 | Repository: fauna/fauna-jvm
Fauna’s JVM client driver lets you run FQL queries from Java and Scala applications.
This guide shows how to set up the driver and use it to run FQL queries.
This driver can only be used with FQL v10. It’s not compatible with earlier versions of FQL. To use earlier FQL versions, use the faunadb-jvm driver.
Supported cloud runtimes
- AWS Lambda (See AWS Lambda connections)
Installation
The driver is available on the Maven central repository. You can add the driver to your Java project using Gradle or Maven.
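For example, assuming the driver is published under the `com.fauna:fauna-jvm` coordinates (check Maven Central for the current version; `1.0.0` below is a placeholder matching this guide), a Gradle build might declare:

```groovy
// build.gradle — add Maven Central and the driver dependency.
// Coordinates and version are an assumption; verify on Maven Central.
repositories {
    mavenCentral()
}

dependencies {
    implementation 'com.fauna:fauna-jvm:1.0.0'
}
```

For Maven, the equivalent is a `<dependency>` entry with the same `groupId`, `artifactId`, and `version` in your `pom.xml`.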
API reference
API reference documentation for the driver is available at https://fauna.github.io/fauna-jvm/.
Sample app
For a practical example, check out the Java sample app.
This sample app is a production-ready e-commerce application that uses Spring Boot and the Fauna JVM driver. The source code includes comments highlighting best practices for using the driver and composing FQL queries.
Basic usage
The following application:
- Initializes a client instance to connect to Fauna.
- Composes a basic FQL query using an FQL string template.
- Runs the query using query() and asyncQuery().
package org.example;

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.exception.FaunaException;
import com.fauna.query.builder.Query;
import com.fauna.response.QuerySuccess;
import com.fauna.types.Page;

import static com.fauna.codec.Generic.pageOf;
import static com.fauna.query.builder.Query.fql;

public class App {
    // Define a class for `Product` documents
    // in expected results.
    public static class Product {
        public String name;
        public String description;
        public Integer price;
    }

    public static void main(String[] args) {
        try {
            // Initialize a default client.
            // It will get the secret from the $FAUNA_SECRET environment variable.
            FaunaClient client = Fauna.client();

            // Compose a query.
            Query query = fql("""
                Product.sortedByPriceLowToHigh() {
                  name,
                  description,
                  price
                }
                """);

            // Run the query synchronously.
            System.out.println("Running synchronous query:");
            runSynchronousQuery(client, query);

            // Run the query asynchronously.
            System.out.println("\nRunning asynchronous query:");
            runAsynchronousQuery(client, query);
        } catch (FaunaException e) {
            System.err.println("Fauna error occurred: " + e.getMessage());
            e.printStackTrace();
        } catch (InterruptedException | ExecutionException e) {
            e.printStackTrace();
        }
    }

    private static void runSynchronousQuery(FaunaClient client, Query query) throws FaunaException {
        // Use `query()` to run a synchronous query.
        // Synchronous queries block the current thread until the query completes.
        // Accepts the query, expected result class, and a nullable set of query options.
        QuerySuccess<Page<Product>> result = client.query(query, pageOf(Product.class));
        printResults(result.getData());
    }

    private static void runAsynchronousQuery(FaunaClient client, Query query) throws ExecutionException, InterruptedException {
        // Use `asyncQuery()` to run an asynchronous, non-blocking query.
        // Accepts the query, expected result class, and a nullable set of query options.
        CompletableFuture<QuerySuccess<Page<Product>>> futureResult = client.asyncQuery(query, pageOf(Product.class));
        QuerySuccess<Page<Product>> result = futureResult.get();
        printResults(result.getData());
    }

    // Iterate through the products in the page.
    private static void printResults(Page<Product> page) {
        for (Product product : page.getData()) {
            System.out.println("Name: " + product.name);
            System.out.println("Description: " + product.description);
            System.out.println("Price: " + product.price);
            System.out.println("--------");
        }
        // Print the `after` cursor to paginate through results.
        System.out.println("After: " + page.getAfter());
    }
}
Connect to Fauna
Each Fauna query is an independently authenticated request to the Core HTTP API’s Query endpoint. You authenticate with Fauna using an authentication secret.
Get an authentication secret
Fauna supports several secret types. For testing, you can create a key, which is a type of secret:
- Log in to the Fauna Dashboard.
- On the Explorer page, create a database.
- In the database’s Keys tab, click Create Key.
- Choose a Role of server.
- Click Save.
- Copy the Key Secret. The secret is scoped to the database.
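Once you’ve copied the secret, a common next step is to store it in the environment variable the driver reads by default. For example, in a POSIX shell (the secret value below is a placeholder, not a real key):

```shell
# Store the key secret in the environment variable the driver reads by default.
# "fn_example_secret" is a placeholder; use your copied Key Secret.
export FAUNA_SECRET="fn_example_secret"

# Confirm the variable is set for the current shell session.
echo "$FAUNA_SECRET"
```
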
Initialize a client
To send query requests to Fauna, initialize a FaunaClient instance with a Fauna authentication secret. You can pass the secret in a FaunaConfig object:
FaunaConfig config = FaunaConfig.builder().secret("FAUNA_SECRET").build();
FaunaClient client = Fauna.client(config);
For supported properties, see FaunaConfig.Builder in the API reference.
Use an environment variable
If not specified, secret defaults to the FAUNA_SECRET environment variable. For example:
// Defaults to the secret in the `FAUNA_SECRET` env var.
FaunaClient client = Fauna.client();
Connect locally
The client comes with a helper config for connecting to Fauna running locally.
// Connects to Fauna running locally via Docker (http://localhost:8443 and secret "secret").
FaunaClient local = Fauna.local();
Scoped client
You can scope a client to a specific database and role. Scoped clients require a key secret with the built-in admin role. The driver uses this key to create a scoped key internally.
FaunaClient db1 = Fauna.scoped(client, FaunaScope.builder("Database1").build());
FaunaScope scope2 = FaunaScope.builder("Database2").withRole(FaunaRole.named("MyRole")).build();
FaunaClient db2 = Fauna.scoped(client, scope2);
Multiple connections
You can use a single client instance to run multiple asynchronous queries at once. The driver manages HTTP connections as needed. Your app doesn’t need to implement connection pools or other connection management strategies.
You can create multiple client instances to connect to Fauna using different secrets or client configurations.
AWS Lambda connections
AWS Lambda freezes, thaws, and reuses execution environments for Lambda functions. See Lambda execution environment.
When an execution environment is thawed, Lambda only runs the function’s handler code. Objects declared outside of the handler method remain initialized from before the freeze. Lambda doesn’t re-run initialization code outside the handler.
Fauna drivers keep socket connections that can time out during long freezes, causing ECONNRESET errors when thawed.
To prevent timeouts, create Fauna client connections inside function handlers. Fauna drivers use lightweight HTTP connections. You can create new connections for each request while maintaining good performance.
Run FQL queries
Use fql string templates to compose FQL queries. To run the query, pass the template and an expected result class to query() or asyncQuery():
Query query = fql("Product.sortedByPriceLowToHigh()");
QuerySuccess<Page<Product>> result = client.query(query, pageOf(Product.class));
You can also pass a nullable set of query options to query() or asyncQuery(). These options control how the query runs in Fauna. See Query options.
You can only compose FQL queries using string templates.
Define a custom class for your data
Use annotations to map a Java class to a Fauna document or object shape:
import com.fauna.annotation.FaunaField;
import com.fauna.annotation.FaunaId;

class Person {
    @FaunaId
    private String id;

    private String firstName;

    @FaunaField(name = "dob")
    private String dateOfBirth;
}
You can use the com.fauna.annotation package to modify encoding and decoding of specific fields in classes used as query arguments and results:
- @FaunaId: Use only once per class, on a field named id that represents the Fauna document ID. It’s not encoded unless the isClientGenerated flag is true.
- @FaunaTs: Use only once per class, on a field named ts that represents the timestamp of a document. It’s not encoded.
- @FaunaColl: Typically goes unmodeled. Use only once per class, on a field named coll that represents the collection field of a document. It will never be encoded.
- @FaunaField: Can be applied to any field to override its name in Fauna.
- @FaunaIgnore: Can be used to ignore fields during encoding and decoding.
Use classes in the com.fauna.codec package to handle type erasure when the top-level result of a query is a generic, including:
- PageOf<T>, where T is the element type.
- ListOf<T>, where T is the element type.
- MapOf<T>, where T is the value type.
- OptionalOf<T>, where T is the value type.
- NullableDocumentOf<T>, where T is the value type. Use this when a query returns a Fauna document that may be null and you want a concrete NullDocument<T> or NonNullDocument<T> instead of catching a NullDocumentException.
Variable interpolation
Use ${} to pass native Java variables to FQL. You can escape a variable by prepending an additional $.
// Create a native Java var.
var collectionName = "Product";
// Pass the var to an FQL query.
Query query = fql("""
    let collection = Collection(${collectionName})
    collection.sortedByPriceLowToHigh()
    """,
    Map.of("collectionName", collectionName));
The driver encodes interpolated variables to an appropriate FQL type and uses the wire protocol to pass the query to the Core HTTP API’s Query endpoint. This helps prevent injection attacks.
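To see why this matters, here’s a hypothetical sketch (plain Java, no driver code) contrasting naive string concatenation with passing values as data. The `naiveConcat` helper is invented for illustration and is not part of the driver:

```java
public class InterpolationSketch {
    // Hypothetical: splicing user input directly into FQL text.
    // Input like `pizza").delete() //` would change the query's meaning.
    static String naiveConcat(String userInput) {
        return "Product.byName(\"" + userInput + "\")";
    }

    public static void main(String[] args) {
        String malicious = "pizza\").delete() //";
        // The injected fragment becomes part of the query text itself.
        System.out.println(naiveConcat(malicious));
        // With fql("Product.byName(${name})", Map.of("name", userInput)),
        // the driver instead sends the value as a typed query argument,
        // so it remains data and is never parsed as FQL.
    }
}
```
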
Query composition
You can use variable interpolation to pass FQL string templates as query fragments to compose an FQL query:
// Create a reusable query fragment.
Query product = fql("Product.byName('pizza').first()");
// Prepare arguments for the query.
Map<String, Object> queryArgs = Map.of("product", product);
// Use the fragment in another FQL query.
Query query = fql("""
    let product = ${product}
    product {
      name,
      price
    }
    """, queryArgs);
Pagination
Use paginate() to asynchronously iterate through Sets that contain more than one page of results. paginate() accepts the same query options as query() and asyncQuery().
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.client.PageIterator;
import com.fauna.types.Page;

import static com.fauna.query.builder.Query.fql;

public class App {
    public static void main(String[] args) {
        FaunaClient client = Fauna.client();

        // `paginate()` makes an async request to Fauna.
        PageIterator<Product> iter1 = client.paginate(fql("Product.all()"), Product.class);
        // Handle each page. `PageIterator` extends the Java Iterator interface.
        while (iter1.hasNext()) {
            Page<Product> page = iter1.next();
            List<Product> pageData = page.getData();
            // Do something with your data.
        }

        PageIterator<Product> iter2 = client.paginate(fql("Product.all()"), Product.class);
        // You can use `flatten()` on `PageIterator` to iterate over every
        // element in a Set.
        Iterator<Product> productIter = iter2.flatten();
        List<Product> products = new ArrayList<>();
        // Iterate over Product elements without worrying about pages.
        productIter.forEachRemaining(products::add);
    }
}
Query stats
Successful query responses and ServiceException exceptions include query stats:
package org.example;

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.exception.FaunaException;
import com.fauna.exception.ServiceException;
import com.fauna.query.builder.Query;
import com.fauna.response.QueryResponse;
import com.fauna.response.QuerySuccess;

import static com.fauna.query.builder.Query.fql;

public class App {
    public static void main(String[] args) {
        try {
            FaunaClient client = Fauna.client();
            Query query = fql("'Hello world'");
            CompletableFuture<QuerySuccess<String>> futureResponse = client.asyncQuery(query, String.class);
            QueryResponse response = futureResponse.get();
            System.out.println(response.getStats().toString());
        } catch (FaunaException e) {
            if (e instanceof ServiceException) {
                ServiceException serviceException = (ServiceException) e;
                System.out.println(serviceException.getStats().toString());
            }
            System.out.println(e);
        } catch (InterruptedException | ExecutionException e) {
            e.printStackTrace();
        }
    }
}
Client configuration
You can pass a FaunaConfig object to customize the configuration of a FaunaClient instance.
FaunaConfig config = new FaunaConfig.Builder()
    .secret("<FAUNA_SECRET>")
    .build();
FaunaClient client = Fauna.client(config);
For properties, see FaunaConfig.Builder in the API reference.
Environment variables
If not set, secret and endpoint default to the FAUNA_SECRET and FAUNA_ENDPOINT environment variables, respectively.
For example, if you set the following environment variables:
export FAUNA_SECRET=FAUNA_SECRET
export FAUNA_ENDPOINT=https://db.fauna.com/
You can initialize the client with a default configuration:
FaunaClient client = Fauna.client();
Query options
You can pass a QueryOptions object to query() or asyncQuery() to control how a query runs in Fauna. You can also use query options to instrument a query for monitoring and debugging.
Query query = Query.fql("'Hello World'");
QueryOptions options = QueryOptions.builder()
    .linearized(true)
    .queryTags(Map.of("tag", "value"))
    .timeout(Duration.ofSeconds(10))
    .traceParent("00-750efa5fb6a131eb2cf4db39f28366cb-000000000000000b-00")
    .typeCheck(false)
    .build();
QuerySuccess<String> result = client.query(query, String.class, options);
For properties, see QueryOptions.Builder in the API reference.
Event Feeds
The driver supports Event Feeds. An Event Feed asynchronously polls an event source for paginated events.
To use Event Feeds, you must have a Pro or Enterprise plan.
Request an Event Feed
To get an event source, append
set.eventSource()
or
set.eventsOn()
to a
supported Set.
To get an event feed, you can use one of the following methods:
- feed(): Synchronously fetches an event feed and returns a FeedIterator that you can use to iterate through the pages of events.
- asyncFeed(): Asynchronously fetches an event feed and returns a CompletableFuture<FeedIterator> that you can use to iterate through the pages of events.
- poll(): Asynchronously fetches a single page of events from the event feed and returns a CompletableFuture<FeedPage> that you can use to handle each page individually. You can repeatedly call poll() to get successive pages.
You can use flatten() on a FeedIterator to iterate through events rather than pages.
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.concurrent.CompletableFuture;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.event.EventSource;
import com.fauna.event.FaunaEvent;
import com.fauna.event.FeedIterator;
import com.fauna.event.FeedOptions;
import com.fauna.event.FeedPage;
import com.fauna.response.QuerySuccess;

import static com.fauna.query.builder.Query.fql;

// Import the Product class for event data.
import org.example.Product;

public class EventFeedExample {
    private static void printEventDetails(FaunaEvent<Product> event) {
        System.out.println("Event Details:");
        System.out.println("  Type: " + event.getType());
        System.out.println("  Cursor: " + event.getCursor());
        event.getTimestamp().ifPresent(ts ->
            System.out.println("  Timestamp: " + ts)
        );
        event.getData().ifPresent(product ->
            System.out.println("  Product: " + product.toString())
        );
        if (event.getStats() != null) {
            System.out.println("  Stats: " + event.getStats());
        }
        if (event.getError() != null) {
            System.out.println("  Error: " + event.getError());
        }
        System.out.println("-------------------");
    }

    public static void main(String[] args) {
        FaunaClient client = Fauna.client();

        // Calculate the timestamp for 10 minutes ago in microseconds.
        long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000);
        FeedOptions options = FeedOptions.builder()
            .startTs(tenMinutesAgo)
            .pageSize(10)
            .build();

        // Example 1: Using `feed()`
        FeedIterator<Product> syncIterator = client.feed(
            fql("Product.all().eventsOn(.price, .stock)"),
            options,
            Product.class
        );
        System.out.println("----------------------");
        System.out.println("`feed()` results:");
        System.out.println("----------------------");
        syncIterator.forEachRemaining(page -> {
            for (FaunaEvent<Product> event : page.getEvents()) {
                printEventDetails(event);
            }
        });

        // Example 2: Using `asyncFeed()`
        CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed(
            fql("Product.all().eventsOn(.price, .stock)"),
            options,
            Product.class
        );
        FeedIterator<Product> iterator = iteratorFuture.join();
        System.out.println("----------------------");
        System.out.println("`asyncFeed()` results:");
        System.out.println("----------------------");
        iterator.forEachRemaining(page -> {
            for (FaunaEvent<Product> event : page.getEvents()) {
                printEventDetails(event);
            }
        });

        // Example 3: Using `flatten()` on a `FeedIterator`
        FeedIterator<Product> flattenedIterator = client.feed(
            fql("Product.all().eventSource()"),
            options,
            Product.class
        );
        Iterator<FaunaEvent<Product>> eventIterator = flattenedIterator.flatten();
        List<FaunaEvent<Product>> allEvents = new ArrayList<>();
        eventIterator.forEachRemaining(allEvents::add);
        System.out.println("----------------------");
        System.out.println("`flatten()` results:");
        System.out.println("----------------------");
        for (FaunaEvent<Product> event : allEvents) {
            printEventDetails(event);
        }

        // Example 4: Using `poll()`
        QuerySuccess<EventSource> sourceQuery = client.query(
            fql("Product.all().eventSource()"),
            EventSource.class
        );
        EventSource source = EventSource.fromResponse(sourceQuery.getData());
        CompletableFuture<FeedPage<Product>> pageFuture = client.poll(
            source,
            options,
            Product.class
        );
        while (pageFuture != null) {
            FeedPage<Product> page = pageFuture.join();
            List<FaunaEvent<Product>> events = page.getEvents();
            System.out.println("----------------------");
            System.out.println("`poll()` results:");
            System.out.println("----------------------");
            for (FaunaEvent<Product> event : events) {
                printEventDetails(event);
            }
            if (page.hasNext()) {
                FeedOptions nextPageOptions = options.nextPage(page);
                pageFuture = client.poll(source, nextPageOptions, Product.class);
            } else {
                pageFuture = null;
            }
        }
    }
}
If you pass an event source directly to feed() or poll() and changes occur between the creation of the event source and the Event Feed request, the feed replays and emits any related events. In most cases, you’ll get events after a specific start time or cursor.
Get events after a specific start time
When you first poll an event source using an Event Feed, you usually include a
startTs
(start timestamp) in the FeedOptions
passed to feed()
,
asyncFeed()
, or poll()
.
startTs
is an integer representing a time in microseconds since the Unix
epoch. The request returns events that occurred after the specified timestamp
(exclusive).
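As a sanity check on the units, this stdlib-only sketch computes "now minus 10 minutes" in microseconds, the same quantity the examples below build with `System.currentTimeMillis() * 1000`. The class and method names are illustrative, not part of the driver:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class StartTsSketch {
    // Microseconds since the Unix epoch for a point 10 minutes in the past.
    public static long tenMinutesAgoMicros() {
        Instant cutoff = Instant.now().minus(10, ChronoUnit.MINUTES);
        return ChronoUnit.MICROS.between(Instant.EPOCH, cutoff);
    }

    public static void main(String[] args) {
        long nowMicros = ChronoUnit.MICROS.between(Instant.EPOCH, Instant.now());
        long startTs = tenMinutesAgoMicros();
        // 10 minutes is 600,000,000 microseconds.
        System.out.println(nowMicros - startTs >= 600_000_000L);
    }
}
```
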
Query query = fql("Product.all().eventsOn(.price, .stock)");

// Calculate the timestamp for 10 minutes ago in microseconds.
long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000);
FeedOptions options = FeedOptions.builder()
    .startTs(tenMinutesAgo)
    .pageSize(10)
    .build();

// Example 1: Using `feed()`
FeedIterator<Product> syncIterator = client.feed(query, options, Product.class);

// Example 2: Using `asyncFeed()`
CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed(query, options, Product.class);

// Example 3: Using `poll()`
QuerySuccess<EventSource> sourceQuery = client.query(query, EventSource.class);
EventSource source = EventSource.fromResponse(sourceQuery.getData());
CompletableFuture<FeedPage<Product>> pageFuture = client.poll(source, options, Product.class);
Get events after a specific cursor
After the initial request, you usually get subsequent events using the cursor
for the last page or event. To get events after a cursor (exclusive), include
the cursor in the FeedOptions
passed to feed()
, asyncFeed()
, or poll()
:
Query query = fql("Product.all().eventsOn(.price, .stock)");

FeedOptions options = FeedOptions.builder()
    .cursor("gsGabc456") // Cursor for the last page.
    .pageSize(10)
    .build();

// Example 1: Using `feed()`
FeedIterator<Product> syncIterator = client.feed(query, options, Product.class);

// Example 2: Using `asyncFeed()`
CompletableFuture<FeedIterator<Product>> iteratorFuture = client.asyncFeed(query, options, Product.class);

// Example 3: Using `poll()`
QuerySuccess<EventSource> sourceQuery = client.query(query, EventSource.class);
EventSource source = EventSource.fromResponse(sourceQuery.getData());
CompletableFuture<FeedPage<Product>> pageFuture = client.poll(source, options, Product.class);
Error handling
Exceptions can be raised in two different places:
- While fetching a page
- While iterating through a page’s events
This distinction lets you ignore errors originating from event processing. For example:
try {
    FeedIterator<Product> syncIterator = client.feed(
        fql("Product.all().map(.details.toUpperCase()).eventSource()"),
        options,
        Product.class
    );
    syncIterator.forEachRemaining(page -> {
        try {
            for (FaunaEvent<Product> event : page.getEvents()) {
                // Event-specific handling.
                System.out.println("Event: " + event);
            }
        } catch (FaunaException e) {
            // Handle errors for specific events within the page.
            System.err.println("Error processing event: " + e.getMessage());
        }
    });
} catch (FaunaException e) {
    // Additional handling for initialization errors.
    System.err.println("Error occurred with event feed initialization: " + e.getMessage());
}
Event Streaming
The driver supports Event Streaming.
To get an event source, append set.eventSource() or set.eventsOn() to a supported Set.
To start and subscribe to the stream, pass an EventSource and related StreamOptions to stream() or asyncStream():
// Get an event source.
Query query = fql("Product.all().eventSource() { name, stock }");
QuerySuccess<EventSource> tokenResponse = client.query(query, EventSource.class);
EventSource eventSource = EventSource.fromResponse(tokenResponse.getData());

// Calculate the timestamp for 10 minutes ago in microseconds.
long tenMinutesAgo = System.currentTimeMillis() * 1000 - (10 * 60 * 1000 * 1000);
StreamOptions streamOptions = StreamOptions.builder().startTimestamp(tenMinutesAgo).build();

// Example 1: Using `stream()`
FaunaStream<Product> stream = client.stream(eventSource, streamOptions, Product.class);

// Example 2: Using `asyncStream()`
CompletableFuture<FaunaStream<Product>> futureStream = client.asyncStream(eventSource, streamOptions, Product.class);
If changes occur between the creation of the event source and the stream request, the stream replays and emits any related events.
Alternatively, you can pass an FQL query that returns an event source to stream() or asyncStream():
Query query = fql("Product.all().eventSource() { name, stock }");
// Example 1: Using `stream()`
FaunaStream<Product> stream = client.stream(query, Product.class);
// Example 2: Using `asyncStream()`
CompletableFuture<FaunaStream<Product>> futureStream = client.asyncStream(query, Product.class);
Create a subscriber class
These methods return a FaunaStream publisher that lets you handle events as they arrive. Create a class that implements the Flow.Subscriber interface to process events:
package org.example;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.atomic.AtomicInteger;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.event.FaunaEvent;
import com.fauna.event.FaunaStream;
import com.fauna.exception.FaunaException;

import static com.fauna.query.builder.Query.fql;

// Uses the Product class from the basic usage example,
// also in the org.example package.
public class EventStreamExample {
    public static void main(String[] args) {
        try {
            FaunaClient client = Fauna.client();
            // Create a stream of all products. Project the name and stock.
            FaunaStream<Product> stream = client.stream(fql("Product.all().eventSource() { name, stock }"), Product.class);
            // Create a subscriber to handle stream events.
            ProductSubscriber subscriber = new ProductSubscriber();
            stream.subscribe(subscriber);
            // Wait for the subscriber to complete.
            subscriber.awaitCompletion();
        } catch (FaunaException e) {
            System.err.println("Fauna error occurred: " + e.getMessage());
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    static class ProductSubscriber implements Flow.Subscriber<FaunaEvent<Product>> {
        private final AtomicInteger eventCount = new AtomicInteger(0);
        private Flow.Subscription subscription;
        private final int maxEvents;
        private final CountDownLatch completionLatch = new CountDownLatch(1);

        public ProductSubscriber() {
            // Stream closes after 3 events.
            this.maxEvents = 3;
        }

        @Override
        public void onSubscribe(Flow.Subscription subscription) {
            this.subscription = subscription;
            subscription.request(1);
        }

        @Override
        public void onNext(FaunaEvent<Product> event) {
            // Handle each event...
            int count = eventCount.incrementAndGet();
            System.out.println("Received event " + count + ":");
            System.out.println("  Type: " + event.getType());
            System.out.println("  Cursor: " + event.getCursor());
            System.out.println("  Timestamp: " + event.getTimestamp());
            System.out.println("  Data: " + event.getData().orElse(null));
            if (count >= maxEvents) {
                System.out.println("Closing stream after " + maxEvents + " events");
                subscription.cancel();
                completionLatch.countDown();
            } else {
                subscription.request(1);
            }
        }

        @Override
        public void onError(Throwable throwable) {
            System.err.println("Error in stream: " + throwable.getMessage());
            completionLatch.countDown();
        }

        @Override
        public void onComplete() {
            System.out.println("Stream completed.");
            completionLatch.countDown();
        }

        public int getEventCount() {
            return eventCount.get();
        }

        public void awaitCompletion() throws InterruptedException {
            completionLatch.await();
        }
    }
}
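The same `Flow.Subscriber` mechanics can be exercised without Fauna at all. Here’s a stdlib-only sketch that swaps `SubmissionPublisher` in for `FaunaStream` (an illustrative stand-in, not driver code) while keeping the same request-one-item-at-a-time backpressure pattern:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowSketch {
    static class CollectingSubscriber implements Flow.Subscriber<String> {
        final List<String> received = new ArrayList<>();
        final CountDownLatch done = new CountDownLatch(1);
        private Flow.Subscription subscription;

        @Override public void onSubscribe(Flow.Subscription s) {
            this.subscription = s;
            s.request(1); // Request items one at a time (backpressure).
        }
        @Override public void onNext(String item) {
            received.add(item);
            subscription.request(1);
        }
        @Override public void onError(Throwable t) { done.countDown(); }
        @Override public void onComplete() { done.countDown(); }
    }

    public static List<String> run() throws InterruptedException {
        CollectingSubscriber sub = new CollectingSubscriber();
        try (SubmissionPublisher<String> pub = new SubmissionPublisher<>()) {
            pub.subscribe(sub);
            pub.submit("add");
            pub.submit("update");
            pub.submit("remove");
        } // close() triggers onComplete after deliveries.
        sub.done.await();
        return sub.received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run());
    }
}
```
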
Debug logging
To log the driver’s HTTP requests and responses, set the FAUNA_DEBUG environment variable to 1. The driver outputs requests and responses, including headers, to stderr. You can also use your own logger.
Setting Level.WARNING is equivalent to FAUNA_DEBUG=0. Setting Level.FINE is equivalent to FAUNA_DEBUG=1. The driver logs HTTP request bodies at Level.FINEST.
import java.util.logging.ConsoleHandler;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.SimpleFormatter;

import com.fauna.client.Fauna;
import com.fauna.client.FaunaClient;
import com.fauna.client.FaunaConfig;

class App {
    public static void main(String[] args) {
        Handler handler = new ConsoleHandler();
        handler.setLevel(Level.FINEST);
        handler.setFormatter(new SimpleFormatter());
        FaunaClient client = Fauna.client(FaunaConfig.builder().logHandler(handler).build());
    }
}