Ingesting Data

Before ingesting data, you will need an API Token, which you can generate from the Settings -> Tokens page in Axiom. See the API Tokens documentation.

Once you have an ingest token, there are four ways to get your data into Axiom:

  1. Using the Ingest API
  2. Using a Data Shipper (Logstash, Filebeat, etc.)
  3. Using an Integration
  4. Using the Elasticsearch Bulk API that Axiom supports natively

Ingest API

Axiom exposes a simple REST API that accepts any of the following formats:

  • application/json - a single event or a JSON array of events
  • application/x-ndjson - newline-delimited JSON, one event per line
  • text/csv - must include a header line with the field names

  curl -X POST \
    -H "Authorization: Bearer <INGEST_TOKEN>" \
    -H "Content-Type: application/x-ndjson" \
    -d '{ "this": "is", "an": "example", "json": 1 }' \
    https://<axiom-url>/datasets/<my-dataset>/ingest
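
A CSV payload can be sent the same way. The field names and values below are purely illustrative; the first line is the required header:

  curl -X POST \
    -H "Authorization: Bearer <INGEST_TOKEN>" \
    -H "Content-Type: text/csv" \
    -d $'user,action,count\nalice,login,1' \
    https://<axiom-url>/datasets/<my-dataset>/ingest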

If you would prefer to use a language binding, our Go client library is currently available.

More client libraries are coming very soon!

Limits

Kind                        Limit
Maximum Event Batch Size    1000 events
Maximum Event Fields        250 fields
Maximum Array Field Items   100 items
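
Because a single request can contain at most 1000 events, larger NDJSON files should be split into batches before ingesting. A minimal shell sketch, assuming a local file named events.ndjson:

  # Split into chunks of at most 1000 lines (one event per line).
  split -l 1000 events.ndjson batch_

  # Ingest each chunk; --data-binary preserves the newlines that
  # separate NDJSON events (-d would strip them).
  for f in batch_*; do
    curl -X POST \
      -H "Authorization: Bearer <INGEST_TOKEN>" \
      -H "Content-Type: application/x-ndjson" \
      --data-binary "@$f" \
      https://<axiom-url>/datasets/<my-dataset>/ingest
  done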

Data Shippers

Further documentation for data shippers is coming soon!

Integrations

Axiom currently supports the following integrations:

  • Amazon Web Services
  • Apache Spark
  • Apache Storm
  • Azure
  • CircleCI
  • Docker
  • Elastic
  • Google Cloud Platform
  • GitHub
  • Gitter
  • Hadoop
  • HAProxy
  • Heroku
  • MongoDB
  • MySQL
  • nginx
  • npm
  • Ping Checks
  • PostgreSQL
  • Redis
  • Slack
  • TCP Checks
  • Travis CI

Each integration automatically pulls events from its corresponding service and directs them into related datasets. From there, you can use any of Axiom's functionality to analyze, stream, or monitor the integration data.

See the integrations documentation to learn more.

Elasticsearch Bulk API

Further documentation for the Elasticsearch Bulk API is coming soon!
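
In the meantime, requests use the standard Elasticsearch bulk format: an action line (for example { "index": {} }) followed by the event itself, with the body terminated by a newline. The endpoint path below is only a placeholder, not a confirmed Axiom URL:

  curl -X POST \
    -H "Authorization: Bearer <INGEST_TOKEN>" \
    -H "Content-Type: application/x-ndjson" \
    --data-binary $'{ "index": {} }\n{ "this": "is", "an": "example" }\n' \
    https://<axiom-url>/<elastic-endpoint>/_bulk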