Import your own data into Exabel
The Data API allows you to import your own data into your private namespace. It also allows you to search for companies through a range of identifiers, such as ticker or FIGI.
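For example, a ticker lookup might look like the sketch below. Note that everything beyond the base URL is an assumption: the entities:search endpoint, the terms list, and the mic/ticker field names are modeled on the resource-style URLs used elsewhere in this API, so check the EntityService reference for the authoritative request shape.
# Hypothetical sketch: endpoint and field names are assumptions, not confirmed API surface.
curl --request POST \
--url https://data.api.exabel.com/v1/entityTypes/company/entities:search \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header 'x-api-key: xxx' \
--data '{"terms": [{"field": "mic", "query": "XNAS"}, {"field": "ticker", "query": "AAPL"}]}'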
Recommendation: use Import Jobs for data import pipelines
While it is possible to import data directly through the API (there is full feature parity), Import Jobs are the easiest way to import data into Exabel. An Import Job lets you use SQL on a data warehouse (Snowflake/BigQuery) to query the data to be imported, and then makes the necessary Data API calls for you.
Exabel's data engineering team uses Import Jobs for all production data pipelines.
For small or ad-hoc data imports, consider using the File Uploader.
Use the Export API to query & retrieve time series data
The Data API is primarily for importing data into Exabel. To query and retrieve time series data, use the Export API instead.
Authentication
Authentication is only through API keys
Currently, the Data API is accessible only through a customer-level API key. (Support for user-level API tokens is coming soon.)
Example:
curl --request GET \
--url https://data.api.exabel.com/v1/dataSets \
--header 'accept: application/json' \
--header 'x-api-key: xxx'
Services
DataSetService: Manage data sets owned by your customer organization. Also allows you to list data sets from vendors that you subscribe to.
EntityService: Create and manage entity types and entities (brands, apps, domains, etc.) to represent levels of granularity in your data. See the entity sketch below this list.
RelationshipService: Create and manage relationship types and relationships that link your entities to companies, e.g. to indicate ownership of a brand by a company. See the relationship sketch below this list.
SignalService: Create and manage signals, which are logical groupings of time series with the same meaning. For example, you may have a card_spend signal that groups together dozens of card spend time series, each belonging to a different company or brand. See the signal and time series sketch below this list.
TimeSeriesService: Create and manage time series, which hold the actual data points.
NamespaceService: List your private customer namespace. See the namespace sketch below this list.
ImportJobService: Allows you to programmatically trigger an Import Job, which is the simplest way of importing data from a data warehouse (Snowflake/BigQuery).
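Entity sketch (referenced from EntityService above). A minimal, hypothetical example of creating a brand entity in a namespace ns. The entity type ns.brand, the entity id ns.acme_coffee, and the payload fields are illustrative assumptions; verify the exact request body against the EntityService reference.
# Hypothetical sketch: resource names and payload fields are assumptions.
curl --request POST \
--url https://data.api.exabel.com/v1/entityTypes/ns.brand/entities \
--header 'content-type: application/json' \
--header 'x-api-key: xxx' \
--data '{"name": "entityTypes/ns.brand/entities/ns.acme_coffee", "displayName": "Acme Coffee"}'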
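Relationship sketch (referenced from RelationshipService above). This hypothetical example links a company to the brand created above through an ns.HAS_BRAND relationship type. The fromEntity/toEntity field names are assumptions, and <company_resource_name> stands for a company resource name you would obtain from a company search; check the RelationshipService reference for the confirmed payload.
# Hypothetical sketch: relationship type, endpoint and field names are assumptions.
curl --request POST \
--url https://data.api.exabel.com/v1/relationshipTypes/ns.HAS_BRAND/relationships \
--header 'content-type: application/json' \
--header 'x-api-key: xxx' \
--data '{"parent": "relationshipTypes/ns.HAS_BRAND", "fromEntity": "<company_resource_name>", "toEntity": "entityTypes/ns.brand/entities/ns.acme_coffee"}'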
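Signal and time series sketch (referenced from SignalService and TimeSeriesService above). The first call creates a card_spend signal; the second uploads points to the time series identified by entity plus signal. The exact endpoints and the points format (time/value) are assumptions; consult the SignalService and TimeSeriesService references for the confirmed payloads.
# Hypothetical sketch: endpoints, resource names and point format are assumptions.
curl --request POST \
--url https://data.api.exabel.com/v1/signals \
--header 'content-type: application/json' \
--header 'x-api-key: xxx' \
--data '{"name": "signals/ns.card_spend", "displayName": "Card spend"}'

curl --request POST \
--url https://data.api.exabel.com/v1/timeSeries \
--header 'content-type: application/json' \
--header 'x-api-key: xxx' \
--data '{"name": "entityTypes/ns.brand/entities/ns.acme_coffee/signals/ns.card_spend", "points": [{"time": "2024-01-01T00:00:00Z", "value": 1234.5}]}'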
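Namespace sketch (referenced from NamespaceService above). Listing namespaces tells you which prefix (the ns placeholder in the sketches above) to use for your own resources. The /v1/namespaces path is an assumption modeled on the /v1/dataSets example.
# Hypothetical sketch: the namespaces path is an assumption.
curl --request GET \
--url https://data.api.exabel.com/v1/namespaces \
--header 'accept: application/json' \
--header 'x-api-key: xxx'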