Query logs API conventions

Reference: Query log HTTP API reference

You can use the query logs HTTP API to request and download query logs programmatically. This guide covers how to authenticate requests and fetch logs using the API.

Base URL

Query log HTTP API endpoints use https://account.fauna.com as the base URL.

For example, the full URL for the /api/v1/logs endpoint is https://account.fauna.com/api/v1/logs.

Fauna routes requests to accounts based on the account key secret used for authentication.

Authentication

You authenticate with the query logs HTTP API using an account key’s secret. Account keys are different from keys used for query requests.

You pass the account key’s secret to the API as a bearer token.

Create an account key

You can create an account key in the Fauna Dashboard.

  1. Log in to the Fauna Dashboard and click Account in the left navigation.

  2. Click Account Keys.

  3. Click Create Key.

  4. Enter a Name and an optional TTL. The TTL is the number of days until the account key expires.

  5. Click Create.

  6. Copy the Secret Key. This is the account key’s secret.

You can use the account key’s secret to authenticate query logs HTTP API requests.
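
For example, you might store the secret in an environment variable and pass it as a bearer token in the Authorization header, as the examples in this guide do. The variable name is only a convention used in these examples:

    # Store the account key's secret in an environment variable.
    # Replace <secret> with the Secret Key copied from the Dashboard.
    export ACCOUNT_KEY_SECRET=<secret>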

API workflow

To fetch logs using the API, use the following workflow:

  1. Make a POST request to the /api/v1/logs API endpoint to initiate a request for a set of query logs.

    You must specify:

    • A region_group or database.

      If you specify a database, you must provide a database path that includes the Region Group. For example, to request query logs for the ECommerce database in the us-std Region Group, use "database": "us-std/ECommerce".

    • A time_start in ISO 8601 format.

    • A time_end in ISO 8601 format. You can specify a future time. The interval between the time_start and time_end can’t exceed 90 days.

    curl -X POST \
      'https://account.fauna.com/api/v1/logs?type=query' \
      -H "Authorization: Bearer $ACCOUNT_KEY_SECRET" \
      -H 'Content-Type: application/json' \
      -d '{
        "database": "us-std/ECommerce",
        "time_start": "2099-07-10T07:46:05Z",
        "time_end": "2099-07-11T07:46:05Z"
      }'

    The response includes a request_id you can use to check the status of the request:

    {
      "request_id": "<REQUEST_ID>",
      "state": "Pending",
      "time_start": "2099-07-10T07:46:05Z",
      "time_end": "2099-07-11T07:46:05Z",
      "region_group": "us-std",
      "updated_at": "2099-07-11T21:53:47.398213Z",
      "version": 0,
      "database": "us-std/ECommerce"
    }
  2. Use the request ID to make a GET request to the /api/v1/logs/<REQUEST_ID> endpoint:

    curl -X GET \
      'https://account.fauna.com/api/v1/logs/<REQUEST_ID>?type=query&regionGroup=us-std' \
      -H "Authorization: Bearer $ACCOUNT_KEY_SECRET"

    The API response includes the query logs request’s current state. You can periodically poll the endpoint until the state is Complete.

    Once Complete, the response includes a presigned_url that links to the compressed log file:

    {
      "request_id": "<REQUEST_ID>",
      "state": "Complete",
      "time_start": "2099-07-10T07:46:05Z",
      "time_end": "2099-07-11T07:46:05Z",
      "region_group": "us-std",
      "updated_at": "2099-07-11T21:53:54.899268Z",
      "version": 2,
      "presigned_url": "https://link.to.your.logs.com",
      "presigned_url_expiration_time": "2099-07-11T22:53:54Z",
      "database": "us-std/ECommerce"
    }

    The presigned_url is valid until the presigned_url_expiration_time. For an example of downloading the log file, see the sketch after this workflow.

  3. After the presigned_url_expiration_time, you can get a new log file link by making a POST request to the /api/v1/logs/<REQUEST_ID>/url API endpoint:

    curl -X POST \
      'https://account.fauna.com/api/v1/logs/<REQUEST_ID>/url?type=query&regionGroup=us-std' \
      -H "Authorization: Bearer $ACCOUNT_KEY_SECRET"

    The response includes a new presigned_url:

    {
      "presigned_url": "https://new.link.to.your.logs.com",
      "state": "CreatingNewUrl",
      "time_end": "2099-07-11T07:46:05Z",
      "version": 3,
      "region_group": "us-std",
      "time_start": "2099-07-10T07:46:05Z",
      "request_id": "<REQUEST_ID>",
      "presigned_url_expiration_time": "2099-07-11T22:53:54Z",
      "updated_at": "2099-07-11T22:00:32.164516Z"
    }
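
Once a request’s state is Complete, you can download the compressed log file from its presigned_url before the presigned_url_expiration_time. For example, with curl (the output filename below is only an example; use whatever name and extension suit the file you receive):

    # Download the compressed log file using the presigned URL from the response.
    curl -o query-logs.gz '<PRESIGNED_URL>'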

Example: Programmatically fetch query logs

The following Node.js application programmatically fetches query logs from the us-std Region Group. The app follows the general API workflow.

To run the app:

  1. Create a project directory and navigate to it:

    mkdir query-logs-demo
    cd query-logs-demo
  2. Install app dependencies:

    npm install axios dotenv luxon winston
  3. Create a .env file. In the file, set the ACCOUNT_KEY_SECRET environment variable to an account key’s secret:

    # .env file
    ACCOUNT_KEY_SECRET=fnacapi...
  4. Create an app.js file and add the following JavaScript to it:

    require('dotenv').config();
    const axios = require('axios').default;
    const winston = require('winston');
    const { DateTime, Duration } = require("luxon");
    
    const faunaClient = axios.create({
      baseURL: "https://account.fauna.com",
      timeout: 10000,
    });
    
    const logger = winston.createLogger({
      level: 'info',
      format: winston.format.json(),
      defaultMeta: { service: 'querylogs-demo' },
      transports: [
        new winston.transports.Console({
          level: "debug",
          handleExceptions: true,
          format: winston.format.json(),
        }),
      ],
    });
    
    const today = DateTime.now().toISO();
    const yesterday = DateTime.now().minus(Duration.fromISO("P1D")).toISO();
    
    async function getLogs() {
      if (process.env["ACCOUNT_KEY_SECRET"] === undefined) {
        throw new Error("You must set ACCOUNT_KEY_SECRET in your local environment to run this program!");
      }
    
      try {
        const headers = { Authorization: `Bearer ${process.env["ACCOUNT_KEY_SECRET"]}` };
        const { data: querylogRequest } = await faunaClient.post(
          "/api/v1/logs?type=query",
          { region_group: "us-std", time_start: yesterday, time_end: today},
          { headers }
        );
        logger.info(querylogRequest);
        return await pollResults(querylogRequest, headers, "us-std");
      } catch (error) {
        logger.error("Error in getLogs:", error);
        throw error;
      }
    }
    
    async function pollResults(
      querylogRequest,
      headers,
      region_group,
    ) {
      let result;
      const maxRuntimeMs = 300 * 1000;
      const time_start = DateTime.now();
      do {
        try {
          ({ data: result } = await faunaClient.get(
            `/api/v1/logs/${querylogRequest.request_id}?regionGroup=${region_group}&type=query`,
            { headers }
          ));
          await new Promise((resolve) => setTimeout(resolve, 1000));
          logger.info(`State: ${result.state}`);
        } catch (error) {
          logger.error("Error in polling:", error);
          throw error;
        }
      } while (
        DateTime.now().diff(time_start).as('milliseconds') < maxRuntimeMs &&
        !["Complete", "DoesNotExist", "Failed", "TimedOut"].includes(result.state)
      );
      logger.info(result);
      return result;
    }
    
    getLogs()
      .then(() => logger.info("Thanks for trying out Fauna logs! Please give us any and all feedback!"))
      .catch(error => logger.error("An error occurred:", error));
  5. Save app.js and run:

    node app.js

    The app prints console logs containing the query logs request’s state. When the query logs are ready for download, it prints a console log containing the presigned URL for the query logs file.

You can extend the app to download the logs and export them to a third-party service for visualization or analysis.
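
For example, a minimal download step added to app.js might look like the following sketch. It reuses the app’s axios and logger objects and assumes the result returned by getLogs() includes a presigned_url, as shown in the API workflow above; the downloadLogs() name, output filename, and error handling are illustrative only:

    const fs = require('fs');

    // Hypothetical extension: download the compressed log file from the presigned URL.
    async function downloadLogs(result, outputPath = 'query-logs.gz') {
      if (result.state !== 'Complete' || !result.presigned_url) {
        throw new Error(`Logs aren't ready for download (state: ${result.state})`);
      }
      // Stream the response body to a local file.
      const response = await axios.get(result.presigned_url, { responseType: 'stream' });
      await new Promise((resolve, reject) => {
        const file = fs.createWriteStream(outputPath);
        response.data.pipe(file);
        file.on('finish', resolve);
        file.on('error', reject);
      });
      logger.info(`Downloaded query logs to ${outputPath}`);
    }

    // Example usage, replacing the getLogs() call at the end of app.js:
    // getLogs()
    //   .then((result) => downloadLogs(result))
    //   .catch((error) => logger.error("An error occurred:", error));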
