Send Data

Send (ingest), transport, and fetch data from different sources, such as relational databases, web logs, batch data, real-time streams, and application logs, for later use with the Axiom API.

You can also collect, load, group, and move data from one or more sources to Axiom where it can be stored and further analyzed.

Before ingesting data, you need to generate an API token from the Settings → Tokens page on the Axiom dashboard. See the API Tokens documentation for more detail.


Once you have an API token, there are different ways to get your data into Axiom:

  1. Using the Ingest API
  2. Using a data shipper (Logstash, Filebeat, Metricbeat, Fluentd, etc.)
  3. Using the Elasticsearch Bulk API, which Axiom supports natively
  4. Using one of the supported integrations
  5. Using Endpoints

Ingest API

Axiom exposes a simple REST API that can accept any of the following formats:

Ingest using JSON

  • application/json - a single event or a JSON array of events

Example

curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
  -H 'Authorization: Bearer $API_TOKEN' \
  -H 'Content-Type: application/json' \
  -d '[
        {
          "_time":"2021-02-04T03:11:23.222Z",
          "data":{"key1":"value1","key2":"value2"}
        },
        {
          "data":{"key3":"value3"},
          "attributes":{"key4":"value4"}
        },
        {
          "tags": {
            "server": "aws",
            "source": "wordpress"
          }
        }
      ]'
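The same request can be assembled in Python with only the standard library. This is a hypothetical sketch, not an official Axiom client: the helper name is our own, and the dataset name and token shown are placeholders you must replace.

```python
import json
import urllib.request

def build_ingest_request(dataset: str, token: str, events: list) -> urllib.request.Request:
    """Assemble a POST request for the Axiom JSON ingest endpoint."""
    url = f"https://api.axiom.co/v1/datasets/{dataset}/ingest"
    body = json.dumps(events).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_ingest_request(
    "my-dataset",          # placeholder dataset name
    "xaat-example-token",  # placeholder API token
    [{"_time": "2021-02-04T03:11:23.222Z", "data": {"key1": "value1"}}],
)
# Send with: urllib.request.urlopen(req) — requires a valid token.
```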

Ingest using NDJSON

  • application/x-ndjson - ingests multiple JSON objects, each represented as a separate line

Example

curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
  -H 'Authorization: Bearer $API_TOKEN' \
  -H 'Content-Type: application/x-ndjson' \
  -d '{"id":1,"name":"machala"}
  {"id":2,"name":"axiom"}
  {"id":3,"name":"apl"}
  {"index": {"_index": "products"}}
  {"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
  {"queryString": "count()"}'
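If you are generating the payload programmatically, serializing each event as a compact JSON object on its own line produces a valid NDJSON body. A minimal sketch using only Python's standard library (the helper name is ours, not part of any official client):

```python
import json

def to_ndjson(events: list) -> str:
    """Encode each event as a compact JSON object on its own line."""
    return "\n".join(json.dumps(e, separators=(",", ":")) for e in events)

body = to_ndjson([
    {"id": 1, "name": "machala"},
    {"id": 2, "name": "axiom"},
    {"id": 3, "name": "apl"},
])
print(body)
```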

Ingest using CSV

  • text/csv - the payload must include a header line with field names separated by commas

Example

curl -X 'POST' 'https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest' \
  -H 'Authorization: Bearer $API_TOKEN' \
  -H 'Content-Type: text/csv' \
  -d 'user,name
foo,bar'
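When building a CSV payload in code, using a CSV library handles quoting and separators for you. A hypothetical sketch with Python's standard csv module, mirroring the field names in the curl example above:

```python
import csv
import io

def to_csv(fieldnames: list, rows: list) -> str:
    """Render rows as CSV with a header line, as the ingest API expects."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

payload = to_csv(["user", "name"], [{"user": "foo", "name": "bar"}])
print(payload)
```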

Supported Libraries

If you would like to use a language binding instead, client libraries are available for several languages.


Limits

Limits help prevent potential issues that could arise from the ingestion of excessively large events or data structures that are too complex. They help maintain system performance, allow for effective data processing, and manage resources effectively.

Kind                        Limit
Maximum event size          1MB
Maximum events in a batch   10,000
Maximum field name length   200 bytes
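These limits can be checked locally before sending a batch. The sketch below is our own pre-flight guard, not an Axiom client feature, and it treats 1MB as 10^6 bytes (an assumption) and checks only top-level field names:

```python
import json

MAX_EVENT_BYTES = 1_000_000     # assuming 1MB means 10**6 bytes
MAX_BATCH_EVENTS = 10_000
MAX_FIELD_NAME_BYTES = 200

def check_batch(events: list) -> list:
    """Return a list of human-readable limit violations (empty if OK)."""
    problems = []
    if len(events) > MAX_BATCH_EVENTS:
        problems.append(f"batch has {len(events)} events (max {MAX_BATCH_EVENTS})")
    for i, event in enumerate(events):
        size = len(json.dumps(event).encode("utf-8"))
        if size > MAX_EVENT_BYTES:
            problems.append(f"event {i} is {size} bytes (max {MAX_EVENT_BYTES})")
        for name in event:  # top-level field names only
            if len(name.encode("utf-8")) > MAX_FIELD_NAME_BYTES:
                problems.append(f"field name {name!r} exceeds {MAX_FIELD_NAME_BYTES} bytes")
    return problems

print(check_batch([{"data": {"key": "value"}}]))  # [] when within limits
```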

Field Restrictions

Axiom imposes restrictions on the number of fields that each dataset can contain.

If a dataset exceeds the permitted field limit, an error will be generated. To prevent this, ensure that the number of fields in any dataset you are ingesting into Axiom doesn't surpass the allowed limit.

Note that the specific field limit depends on your Axiom pricing tier.
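To compare a batch against your plan's field limit before ingesting, you can count the distinct field names it would introduce. The dotted-name flattening below is an assumption about how nested objects map to fields, and the helper is our own:

```python
def collect_fields(obj, prefix=""):
    """Yield dotted field names for every leaf key in a nested event."""
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            yield from collect_fields(value, name + ".")
        else:
            yield name

events = [
    {"data": {"key1": "v1", "key2": "v2"}},
    {"data": {"key1": "v1"}, "tags": {"server": "aws"}},
]
fields = set()
for event in events:
    fields.update(collect_fields(event))
print(sorted(fields))  # distinct field names across the batch
```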


Timestamp Field

  • _time must contain a valid timestamp or not be present at all. If _time is not present, the server assigns a timestamp. Axiom accepts many date strings and timestamps without knowing the format in advance, including Unix epoch, RFC3339, and ISO 8601.

Data Shippers

Configure, read, collect, and send logs to your Axiom deployment using a variety of data shippers. Data shippers are lightweight agents that acquire logs and metrics, enabling you to ship data directly into Axiom.


  • AWS CloudFront
  • AWS CloudWatch Logs
  • Elastic Beats
  • Fluent Bit
  • FluentD
  • Heroku Log Drains
  • Kubernetes
  • Logstash
  • Loki Multiplexer
  • Syslog Proxy
  • Vector

Integrations

Send logs and metrics from Vercel, Netlify, and other supported integrations.


Get started with integrations here

Endpoints

Endpoints enable you to easily integrate Axiom into your existing data flow by allowing you to use tools and libraries that you are already familiar with.

You can create an Endpoint for your favorite services, such as Honeycomb, Jaeger, Grafana Loki, or Splunk, and send the logs from these services directly into Axiom.


Get started with endpoints here
