Enable S3-compatible endpoints

Cloudflare Logpush supports pushing logs to S3-compatible destinations via the Cloudflare dashboard or via the API.

For more information about Logpush and the current production APIs, refer to the Cloudflare Logpush documentation.

Manage via the Cloudflare dashboard

  1. Log in to the Cloudflare dashboard.

  2. Select the Enterprise account or domain (also known as zone) you want to use with Logpush. Depending on your choice, you have access to account-scoped datasets or zone-scoped datasets, respectively.

  3. Go to Analytics & Logs > Logpush.

  4. Select Create a Logpush job.

  5. In Select a destination, choose S3-Compatible.

  6. Enter or select the following destination information:

    • Bucket - the S3-compatible bucket name
    • Path - the bucket location within the storage container
    • Organize logs into daily subfolders (recommended)
    • Endpoint URL - the URL without the bucket name or path, for example sfo2.digitaloceanspaces.com
    • Bucket region
    • Access Key ID
    • Secret Access Key

When you are done entering the destination details, select Continue.

  7. Select the dataset to push to the storage service.

  8. In the next step, configure your Logpush job:

    • Enter the Job name.
    • Under If logs match, you can select the events to include and/or remove from your logs. Refer to Filters for more information. Not all datasets have this option available.
    • In Send the following fields, you can either push all log fields to your storage destination or select which fields to push.
  9. In Advanced Options, you can:

    • Choose the format of timestamp fields in your logs (RFC3339 (default), Unix, or UnixNano).
    • Select a sampling rate to push a randomly sampled percentage of your logs.
    • Enable redaction for CVE-2021-44228. This option replaces every occurrence of ${ with x{.
  10. Select Submit once you are done configuring your Logpush job.

Manage via API

To set up S3-compatible endpoints:

  1. Create a job with the appropriate endpoint URL and authentication parameters.
  2. Enable the job to begin pushing logs.

Ensure Log Share permissions are enabled before attempting to read or configure a Logpush job. For more information, refer to the Roles section.
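
Before creating a job, you can verify that your credentials have the needed permissions by listing existing Logpush jobs. The examples below authenticate with the legacy X-Auth-Email and X-Auth-Key headers; an API token passed as an Authorization: Bearer header also works. A minimal sketch (the token is a placeholder):

curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
  --header "Authorization: Bearer <API_TOKEN>"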

1. Create a job

To create a job, make a POST request to the Logpush jobs endpoint with the following fields:

  • name (optional) - Use your domain name as the job name.
  • destination_conf - A log destination consisting of an endpoint name, bucket name, bucket path, region, access key ID, and secret access key in the following string format (see the filled-in example after this list):
"s3://<BUCKET_NAME>/<BUCKET_PATH>?region=<REGION>&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=<ENDPOINT_URL>"
  • dataset - The category of logs you want to receive. Refer to Log fields for the full list of supported datasets.
  • output_options (optional) - To configure fields, sample rate, and timestamp format, refer to Log Output Options.
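
For instance, a destination_conf pointing at a hypothetical DigitalOcean Spaces bucket named example-logs in the sfo2 region might look like this (the bucket name, path, and region are illustrative placeholders; the {DATE} token in the path organizes logs into daily subfolders):

"s3://example-logs/http_requests/{DATE}?region=sfo2&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=sfo2.digitaloceanspaces.com"

To check a destination before creating a job, you can also send the destination_conf to the destination validation endpoint (POST /zones/{zone_id}/logpush/validate/destination).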

Example request using cURL:

curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "<DOMAIN_NAME>",
    "destination_conf": "s3://<BUCKET_NAME>/<BUCKET_PATH>?region=<REGION>&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=<ENDPOINT_URL>",
    "output_options": {
      "field_names": ["ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI", "EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
      "timestamp_format": "rfc3339"
    },
    "dataset": "http_requests"
  }'

Response:

{
  "errors": [],
  "messages": [],
  "result": {
    "id": <JOB_ID>,
    "dataset": "http_requests",
    "enabled": false,
    "name": "<DOMAIN_NAME>",
    "output_options": {
      "field_names": ["ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI", "EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
      "timestamp_format": "rfc3339"
    },
    "destination_conf": "s3://<BUCKET_NAME>/<BUCKET_PATH>?region=<REGION>&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=<ENDPOINT_URL>",
    "last_complete": null,
    "last_error": null,
    "error_message": null
  },
  "success": true
}
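
The job is created disabled (enabled is false). Note the id in the response; you need it for the next step. If jq is available, a sketch that captures the job ID directly from the create request:

JOB_ID=$(curl --silent https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>" \
  --header "Content-Type: application/json" \
  --data '{"name": "<DOMAIN_NAME>", "destination_conf": "s3://<BUCKET_NAME>/<BUCKET_PATH>?region=<REGION>&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=<ENDPOINT_URL>", "dataset": "http_requests"}' \
  | jq -r '.result.id')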

2. Enable (update) a job

To enable a job, make a PUT request to the Logpush jobs endpoint. You will use the job ID returned from the previous step in the URL, and send {"enabled": true} in the request body.

Example request using cURL:

curl --request PUT \
  https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs/{job_id} \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>" \
  --header "Content-Type: application/json" \
  --data '{
    "enabled": true
  }'

Response:

{
  "errors": [],
  "messages": [],
  "result": {
    "id": <JOB_ID>,
    "dataset": "http_requests",
    "enabled": true,
    "name": "<DOMAIN_NAME>",
    "output_options": {
      "field_names": ["ClientIP", "ClientRequestHost", "ClientRequestMethod", "ClientRequestURI", "EdgeEndTimestamp", "EdgeResponseBytes", "EdgeResponseStatus", "EdgeStartTimestamp", "RayID"],
      "timestamp_format": "rfc3339"
    },
    "destination_conf": "s3://<BUCKET_NAME>/<BUCKET_PATH>?region=<REGION>&access-key-id=<ACCESS_KEY_ID>&secret-access-key=<SECRET_ACCESS_KEY>&endpoint=<ENDPOINT_URL>",
    "last_complete": null,
    "last_error": null,
    "error_message": null
  },
  "success": true
}
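
To confirm the job is running, you can fetch it again and check that enabled is true. After the first successful push, last_complete should be populated; last_error and error_message surface destination problems. A minimal check:

curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs/{job_id} \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>"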