This is the reference documentation for the Cognite API with an overview of all the available methods.
Most resource types can be paginated, as indicated by the field nextCursor in the response. By passing the value of nextCursor as the cursor parameter, you will get the next page of limit results. Note that all parameters except cursor must stay the same.
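The cursor loop described above can be sketched as follows. This is an illustrative sketch, not SDK code: fetch_page stands in for any CDF list endpoint, and fake_fetch simulates its responses.

```python
def paginate(fetch_page, limit=100):
    """Iterate over all items of a paginated CDF-style endpoint.

    fetch_page is assumed to accept cursor/limit and return a dict with
    "items" and an optional "nextCursor". All parameters other than
    cursor must stay the same between calls.
    """
    cursor = None
    while True:
        page = fetch_page(cursor=cursor, limit=limit)
        yield from page["items"]
        cursor = page.get("nextCursor")
        if cursor is None:  # no nextCursor means this was the last page
            break

# Fake endpoint for illustration: three pages, then no nextCursor.
def fake_fetch(cursor=None, limit=2):
    pages = {None: ([1, 2], "a"), "a": ([3, 4], "b"), "b": ([5], None)}
    items, nxt = pages[cursor]
    return {"items": items, "nextCursor": nxt} if nxt else {"items": items}

print(list(paginate(fake_fetch, limit=2)))  # -> [1, 2, 3, 4, 5]
```

The generator hides the cursor bookkeeping, so callers simply iterate over results.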
If you want to download a lot of resources (let's say events), paginating through millions of records can be slow. We support parallel retrieval through the partition parameter, which has the format m/n, where n is the number of partitions you would like to split the entire data set into. If you want to download the entire data set by splitting it into 10 partitions, query /events with partition=m/10 in parallel, with m running from 1 to 10. The partition parameter needs to be passed to all subqueries.
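A minimal sketch of the parallel scheme, assuming a hypothetical fetch_partition function that wraps a call such as GET /events?partition=m/n (the function and its return value are illustrative, not part of the API):

```python
from concurrent.futures import ThreadPoolExecutor

def partition_params(n):
    """Build the partition values "1/n" ... "n/n" for parallel retrieval."""
    return [f"{m}/{n}" for m in range(1, n + 1)]

def fetch_partition(partition):
    # Placeholder for a real request like GET /events?partition=m/n.
    # The same partition value must be repeated on every subquery,
    # including cursored follow-up pages within that partition.
    return [f"item-from-{partition}"]

def fetch_all(n=10):
    # At most 10 partitions, per the recommendation above.
    with ThreadPoolExecutor(max_workers=n) as pool:
        results = pool.map(fetch_partition, partition_params(n))
    return [item for part in results for item in part]

print(partition_params(3))  # -> ['1/3', '2/3', '3/3']
```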
Processing of parallel retrieval requests is subject to concurrency quota availability. The request returns a 429 response upon exceeding concurrency limits. See the Request throttling chapter below. To prevent unexpected problems and maximize read throughput, you should use at most 10 partitions. Some CDF resources automatically enforce a maximum of 10 partitions. For more specific and detailed information, please read the partition attribute documentation for the CDF resource you're using.
Cognite Data Fusion (CDF) returns the HTTP 429 (Too Many Requests) response status code when project capacity exceeds the limit.
The throttling can happen:
Cognite recommends using a retry strategy based on truncated exponential backoff to handle requests that receive an HTTP 429 response code.
Cognite recommends using a reasonable number (up to 10) of Parallel retrieval partitions.
Following these strategies lets you slow down the request frequency to maximize productivity without having to re-submit/retry failing requests.
See more here.
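The recommended truncated exponential backoff can be sketched like this; the request callable is a stand-in for an HTTP call returning a status code, and the constants are illustrative, not values mandated by the API:

```python
import random

def backoff_delay(attempt, base=0.5, cap=30.0):
    """Truncated exponential backoff with full jitter: the delay grows as
    base * 2**attempt but is capped, and a random fraction of it is used
    so parallel clients don't retry in lockstep."""
    return random.uniform(0, min(cap, base * 2 ** attempt))

def call_with_retry(request, max_retries=5):
    """Retry a callable while it answers 429, backing off between attempts."""
    for attempt in range(max_retries):
        status = request()
        if status != 429:
            return status
        # In real code: time.sleep(backoff_delay(attempt))
    return 429

# A fake endpoint that throttles the first two calls, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    return 429 if calls["n"] <= 2 else 200

print(call_with_retry(flaky))  # -> 200
```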
This API uses calendar versioning, and version names follow the YYYYMMDD format. You can find the versions currently available by using the version selector at the top of this page. To use a specific API version, you can pass the cdf-version: $version header along with your requests to the API.
The beta versions provide a preview of what the stable version will look like in the future. Beta versions contain functionality that is reasonably mature and highly likely to become a part of the stable API. Beta versions are indicated by a -beta suffix after the version name. For example, the beta version header for the 2023-01-01 version is cdf-version: 20230101-beta.
Alpha versions contain functionality that is new and experimental, and not guaranteed to ever become a part of the stable API. This functionality presents no guarantee of service, so its use is subject to caution. Alpha versions are indicated by an -alpha suffix after the version name. For example, the alpha version header for the 2023-01-01 version is cdf-version: 20230101-alpha.
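A small helper illustrating how the cdf-version header value is formed from a calendar version and an optional pre-release channel (the helper itself is hypothetical; only the header name and value format come from the documentation above):

```python
def cdf_version_header(version, channel=None):
    """Build the cdf-version header, e.g. "20230101" for the stable
    2023-01-01 version, or "20230101-beta" / "20230101-alpha" for the
    pre-release channels."""
    value = version.replace("-", "")  # YYYY-MM-DD -> YYYYMMDD
    if channel in ("beta", "alpha"):
        value += f"-{channel}"
    return {"cdf-version": value}

print(cdf_version_header("2023-01-01", "beta"))  # -> {'cdf-version': '20230101-beta'}
```

Pass the resulting dict as extra headers on each request to pin the API version.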
Identity providers (IdPs) are required to be compatible with the OpenID Connect Discovery 1.0 standard, and compliance will now be enforced by the Projects API.
oidcConfiguration.jwksUrl and oidcConfiguration.tokenUrl can be entirely omitted when updating the OIDC configuration for a project. oidcConfiguration.jwksUrl and oidcConfiguration.tokenUrl are preserved for backwards compatibility of the API. However, if these are specified as part of the request body, the values must match exactly the values specified in the OpenID provider configuration document for the configured issuer (found at https://{issuer-url}/.well-known/openid-configuration). If the values do not match, the API will return an error message.
The oidcConfiguration.skewMs field has been deprecated but remains part of the API for backwards compatibility. It can be omitted from the request. If included, it must always be set to 0.
The oidcConfiguration.isGroupCallbackEnabled field has been deprecated but remains part of the API for backwards compatibility. It can be omitted from the request. If included, it must always be set to true.
Added the autoCreateDirectRelations option on the endpoint for ingesting instances. This option lets the user specify whether to create missing target nodes of direct relations.
Added the sources field on the /instances/byids endpoint.
Added the image.InstanceLink and diagrams.InstanceLink annotation types to allow you to link from objects discovered in images and engineering diagrams to data model instances.
Fixed the API documentation for the request body of the POST /projects/{project}/sessions/byids endpoint. The documentation incorrectly stated the request body schema as specifying the list of session IDs to retrieve in the form {"items": [42]}; it should in fact be {"items": [{"id": 42}]}. The documentation has been updated to reflect this.
Fixed the API documentation for the response body of the POST /projects/{project}/sessions/byids endpoint. The documentation incorrectly stated that the nextCursor and previousCursor fields were returned in the response, which was not the case; these fields have now been removed from the API documentation.
nodes and edges.
Added the highlight field in the search endpoint to indicate whether matches in search results should be highlighted.
We've removed authentication via CDF service accounts and API keys, and user sign-in via /login.
Added the POST /documents/aggregate endpoint. The endpoint allows you to count documents, optionally grouped by a property, and also to retrieve all unique values of a property.
Added the POST /documents/list endpoint. The endpoint allows you to iterate through all the documents in a project.
Added the POST /documents/{documentId}/content endpoint. The endpoint lets you download the entire extracted plain text of a document.
Changed the isStep parameter to be editable (i.e., removed the description stating it is not updatable) in POST /timeseries/create.
Added the isStep parameter to the TimeSeriesPatch object used in POST /timeseries/update.
Added the ignoreUnknownIds parameter to POST /sequences/delete. Setting this to true will prevent the operation from failing if one or more of the given sequences do not exist; instead, those given sequences that do exist will be deleted.
Added the documentation attribute, which supports Markdown (rendered as Markdown in Fusion).
The success, failure, and seen statuses enable extractor developers to report status and error messages after ingesting data. They also enable reporting a heartbeat through the seen status, making it easy to identify issues related to crashed applications and scheduling issues.
Added the partition parameter to the GET /sequences endpoint to support parallel retrieval.
Added the partition parameter to the GET /timeseries endpoint to support parallel retrieval.
Added sessions to v1. Sessions let you securely delegate access to CDF resources for CDF services (such as Functions) by an external principal and for an extended time.
You can now remove columns, modify existing columns, and add new columns.
You can now ask for a granularity of up to 100000 hours (previously 48 hours), both in normal aggregates and in synthetic time series.
We are deprecating authentication via CDF service accounts and API keys, and user sign-in via /login, in favor of registering applications and services with your IdP (identity provider) and using OpenID Connect and the IdP framework to manage CDF access securely.
The legacy authentication flow is available for customers using Cognite Data Fusion (CDF) on GCP until further notice. We strongly encourage customers to adopt the new authentication flows as soon as possible.
The following API endpoints are deprecated:
/api/v1/projects/*/apikeys
/api/v1/projects/*/serviceaccounts
/login
/logout
/api/v1/projects/*/groups/serviceaccounts (only the sub-resources for listing, adding, and removing members of groups)
CDF API 0.5 and 0.6 reached their end-of-life after their initial deprecation announcement in Summer 2019.
partition parameter added to the List 3D Nodes endpoint for supporting parallel requests.
sortByNodeId parameter added to the List 3D Nodes endpoint, improving request latency in most cases if set to true.
status shall be a capitalized string.
fileType inside derivedFields to refer to a pre-defined subset of MIME types.
fileType inside derivedFields to find files with a pre-defined subset of MIME types.
geoLocation to refer to the geographic location of the file.
geoLocation to find files matching a certain geographic location. To learn how to leverage the new geoLocation features, follow our guide.
directory, referring to the directory in the source containing the file.
directoryPrefix allows you to find files matching a certain directory prefix.
labels allows you to attach labels to files upon creation or update.
labels allows you to find files that have been annotated with specific labels.
applicationDomains. If this field is set, users can only sign in to the project through applications hosted on a whitelisted domain. Read more.
uniqueValues allows you to find different types and subtypes of events in your project.
labels allows you to find resources that have been annotated with specific labels.
endTime=null.
datasetId introduced in assets, files, events, time series, and sequences.
dataSetIds allows you to narrow down results to resources containing a datasetId from a list of ids or externalIds of data sets. Supported by assets, files, events, time series, and sequences.
datasetsAcl for managing access to data set resources.
datasetScope for assets, files, events, time series, and sequences ACLs. Allows you to scope down access to resources contained within a specified set of data sets.
depth and path. You can use the properties in the filter and retrieve endpoints.
parentExternalId, which is returned for all assets that have a parent with a defined externalId.
assetSubtreeIds as a parameter to filter, search, and list endpoints for all core resources. assetSubtreeIds allows you to specify assets that are subtree roots, and then only retrieve resources that are related to assets within those subtrees.
search.query parameter. This uses an improved search algorithm that tries a wider range of variations of the input terms and gives much better relevancy ranking than the existing search.name and search.description fields.
The search.query parameter for time series search now uses an improved search algorithm that tries a wider range of variations of the input terms and gives much better relevancy ranking.
mimeType can now be updated for existing files in files/update requests.
Time series expanded their filtering capabilities with the new Filter time series endpoint. The endpoint also supports pagination and partitioning. Check out the detailed API documentation here.
externalId and metadata support. Read more here.
rootAssetIds in files GET /files (using a query parameter) and POST /files/list (in the request body).
partition in /assets and /events to support parallel retrieval. See the guide for usage here.
intersectsBoundingBox added to the list asset mappings endpoint. The parameter filters asset mappings to the assets where the bounding box intersects (or is contained within) the specified bounding box.
rootAssetIds added to the list time series endpoint. Returns time series that are linked to an asset that has one of the root assets as an ancestor.
List of changes for the initial API v1 release in comparison to the previous version, API 0.5:
externalId added across resource types. externalId lets you define a unique ID for a data object. Learn more: External IDs.
externalIdPrefix added as a parameter to the list events, assets, and files operations.
data object.
limit, cursor, and nextCursor parameters.
The limit parameter no longer implicitly rounds down the requested page size to the maximum page size.
The sourceId field has been removed from resources. Use externalId instead of sourceId + source to define unique IDs for data objects.
The offset and previousCursor parameters are no longer supported for pagination across resources.
root filter.
rootId field to specify the top element in an asset hierarchy.
rootIds.
name property.
boostName has been removed from the search for assets operation.
path and depth fields.
rootAssetIds allows for narrowing down events belonging only to listed or specified root assets. Supported by the Filter and Search APIs.
assetIds in list files operations now supports multiple assets in the same request.
fileType field renamed to mimeType. The field now requires a MIME formatted string (e.g. text/plain).
uploadedAt field renamed to uploadedTime.
Changing the name or mimeType of a file through the update multiple files operation is no longer supported.
id and externalId of time series. Adding datapoints to time series by name has been removed.
externalId attribute for time series.
externalId during creation of time series. externalId requires uniqueness across time series.
id and externalId of the time series.
legacyName on time series creation. The value is required to be unique.
id and externalId lookup, as well as retrieval of multiple time series within the same request.
id and externalId.
id and externalId. Selecting by name is no longer available.
externalId.
boostName has been removed from the search operation.
name has been removed, as names are no longer unique identifiers.
name is no longer available.
Updating the isString and isStep attributes is removed. The attributes are not intended to be modified after creation of a time series.
id. Use the update multiple time series endpoint instead.
name has been removed. Use externalId instead.
id from a single time series has been removed. Use retrieve multiple datapoints for multiple time series instead.
name has been removed.
apiKeyId, if the request used an API key.
userId attribute renamed to serviceAccountId.
permissions and source attributes.
Projects are used to isolate data in CDF from each other. All objects in CDF belong to a single project, and objects in different projects are generally isolated from each other.
Creates new projects given project details. This functionality is currently only available for Cognite and re-sellers of Cognite Data Fusion. Please contact Cognite Support for more information.
List of new project specifications
required | Array of objects (NewProjectSpec) |
{
  "items": [
    {
      "name": "Open Industrial Data",
      "urlName": "publicdata",
      "adminSourceGroupId": "b7c9a5a4-99c2-4785-bed3-5e6ad9a78603",
      "parentProjectUrlName": "administrative-project",
      "oidcConfiguration": {
        "jwksUrl": "string",
        "tokenUrl": "string",
        "issuer": "string",
        "audience": "string",
        "skewMs": 0,
        "accessClaims": [
          { "claimName": "string" }
        ],
        "scopeClaims": [
          { "claimName": "string" }
        ],
        "logClaims": [
          { "claimName": "string" }
        ],
        "isGroupCallbackEnabled": false,
        "identityProviderScope": "string"
      }
    }
  ]
}
{
  "name": "Open Industrial Data",
  "urlName": "publicdata"
}
The list of all projects that the user has the 'list projects' capability in. The user may not have access to any resources in the listed projects, even if they have access to list the project itself.
{
  "items": [
    { "urlName": "publicdata" }
  ]
}
Retrieves information about a project given the project URL name.
projectName required | string Example: publicdata The CDF project name, equal to the project variable in the server URL. |
const projectInfo = await client.projects.retrieve('publicdata');
{
  "name": "Open Industrial Data",
  "urlName": "publicdata",
  "defaultGroupId": 123871937,
  "authentication": {
    "validDomains": [
      "example.com",
      "google.com"
    ],
    "applicationDomains": [
      "console.cognitedata.com",
      "cdfapplication.example.com"
    ]
  },
  "oidcConfiguration": {
    "jwksUrl": "string",
    "tokenUrl": "string",
    "issuer": "string",
    "audience": "string",
    "skewMs": 0,
    "accessClaims": [
      { "claimName": "string" }
    ],
    "scopeClaims": [
      { "claimName": "string" }
    ],
    "logClaims": [
      { "claimName": "string" }
    ],
    "isGroupCallbackEnabled": false,
    "identityProviderScope": "string"
  }
}
Updates the project configuration.
Warning: Updating a project will invalidate active sessions within that project.
projectName required | string Example: publicdata The CDF project name, equal to the project variable in the server URL. |
Object with updated project configuration.
required | object (ProjectUpdateObjectDTO) Contains the instructions on how to update the project. Note: azureADConfiguration, oidcConfiguration and oAuth2Configuration are mutually exclusive |
{
  "update": {
    "name": { "set": "string" },
    "defaultGroupId": { "set": 0 },
    "validDomains": { "set": ["string"] },
    "applicationDomains": { "set": ["string"] },
    "authenticationProtocol": { "set": "string" },
    "azureADConfiguration": {
      "set": {
        "appId": "string",
        "appSecret": "string",
        "tenantId": "string",
        "appResourceId": "string"
      }
    },
    "oAuth2Configuration": {
      "set": {
        "loginUrl": "string",
        "logoutUrl": "string",
        "tokenUrl": "string",
        "clientId": "string",
        "clientSecret": "string"
      }
    },
    "oidcConfiguration": {
      "modify": {
        "jwksUrl": { "set": "string" },
        "tokenUrl": { "set": "string" },
        "issuer": { "set": "string" },
        "audience": { "set": "string" },
        "skewMs": { "set": 0 },
        "accessClaims": { "set": [ { "claimName": "string" } ] },
        "scopeClaims": { "set": [ { "claimName": "string" } ] },
        "logClaims": { "set": [ { "claimName": "string" } ] },
        "isGroupCallbackEnabled": { "set": true },
        "identityProviderScope": { "set": "string" }
      }
    }
  }
}
{
  "name": "Open Industrial Data",
  "urlName": "publicdata",
  "defaultGroupId": 123871937,
  "authentication": {
    "validDomains": [
      "example.com",
      "google.com"
    ],
    "applicationDomains": [
      "console.cognitedata.com",
      "cdfapplication.example.com"
    ]
  },
  "oidcConfiguration": {
    "jwksUrl": "string",
    "tokenUrl": "string",
    "issuer": "string",
    "audience": "string",
    "skewMs": 0,
    "accessClaims": [
      { "claimName": "string" }
    ],
    "scopeClaims": [
      { "claimName": "string" }
    ],
    "logClaims": [
      { "claimName": "string" }
    ],
    "isGroupCallbackEnabled": false,
    "identityProviderScope": "string"
  }
}
Groups are used to give principals the capabilities to access CDF resources. One principal can be a member in multiple groups and one group can have multiple members. Note that having more than 20 groups per principal is not supported and may result in login issues.
Creates one or more named groups, each with a set of capabilities.
List of groups to create.
required | Array of objects (GroupSpec) |
{
  "items": [
    {
      "name": "Production Engineers",
      "sourceId": "b7c9a5a4-99c2-4785-bed3-5e6ad9a78603",
      "capabilities": [
        {
          "analyticsAcl": {
            "actions": ["READ"],
            "scope": { "all": {} }
          }
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      }
    }
  ]
}
{
  "items": [
    {
      "name": "Production Engineers",
      "sourceId": "b7c9a5a4-99c2-4785-bed3-5e6ad9a78603",
      "capabilities": [
        {
          "analyticsAcl": {
            "actions": ["READ"],
            "scope": { "all": {} }
          }
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "id": 0,
      "isDeleted": false,
      "deletedTime": 0
    }
  ]
}
Deletes the groups with the given IDs.
List of group IDs to delete
items required | Array of integers <int64> non-empty unique [ items <int64 > ] |
{
  "items": [
    23872937137,
    1238712837,
    128371973
  ]
}
{ }
Retrieves a list of groups the asking principal is a member of. Principals with the groups:list capability can optionally ask for all groups in a project.
all | boolean Default: false Whether to get all groups, only available with the groups:list acl. |
const groups = await client.groups.list({ all: true });
{
  "items": [
    {
      "name": "Production Engineers",
      "sourceId": "b7c9a5a4-99c2-4785-bed3-5e6ad9a78603",
      "capabilities": [
        {
          "analyticsAcl": {
            "actions": ["READ"],
            "scope": { "all": {} }
          }
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "id": 0,
      "isDeleted": false,
      "deletedTime": 0
    }
  ]
}
Manage security categories for a specific project. Security categories can be used to restrict access to a resource. Applying a security category to a resource means that only principals (users or service accounts) that also have this security category can access the resource. To learn more about security categories, please read this page.
Creates security categories with the given names. Duplicate names in the request are ignored. If a security category with one of the provided names already exists, the request will fail and no security categories are created.
List of categories to create
required | Array of objects (SecurityCategorySpecDTO) non-empty |
{
  "items": [
    { "name": "Guarded by vendor x" }
  ]
}
{
  "items": [
    {
      "name": "Guarded by vendor x",
      "id": 0
    }
  ]
}
Deletes the security categories that match the provided IDs. If any of the provided IDs does not belong to an existing security category, the request will fail and no security categories are deleted.
List of security category IDs to delete.
items required | Array of integers <int64> non-empty unique [ items <int64 > ] |
{
  "items": [
    23872937137,
    1238712837,
    128371973
  ]
}
{ }
Retrieves a list of all security categories for a project.
sort | string Default: "ASC" Enum: "ASC" "DESC" Sort descending or ascending. |
cursor | string Cursor to use for paging through results. |
limit | integer <int32> <= 1000 Default: 25 Return up to this many results. Maximum is 1000. Default is 25. |
const securityCategories = await client.securityCategories.list({ sort: 'ASC' });
{
  "items": [
    {
      "name": "Guarded by vendor x",
      "id": 0
    }
  ],
  "nextCursor": "string"
}
Sessions are used to maintain access to CDF resources for an extended period of time. The methods available to extend a session's lifetime are client credentials and token exchange. Sessions depend on the project OIDC configuration and may become invalid in the following cases:
The project OIDC configuration has been updated through the update project endpoint. This action invalidates all of the project's sessions.
The session was invalidated through the identity provider.
Create sessions
A request containing the information needed to create a session.
Array of CreateSessionWithClientCredentialsRequest (object) or CreateSessionWithTokenExchangeRequest (object) (CreateSessionRequest) = 1 items |
{
  "items": [
    {
      "clientId": "string",
      "clientSecret": "string"
    }
  ]
}
{
  "items": [
    {
      "id": 0,
      "type": "CLIENT_CREDENTIALS",
      "status": "READY",
      "nonce": "string",
      "clientId": "string"
    }
  ]
}
List all sessions in the current project.
status | string Enum: "ready" "active" "cancelled" "revoked" "access_lost" If given, only sessions with the given status are returned. |
cursor | string Cursor to use for paging through results. |
limit | integer <int32> <= 1000 Default: 25 Return up to this many results. Maximum is 1000. Default is 25. |
{
  "items": [
    {
      "id": 0,
      "type": "CLIENT_CREDENTIALS",
      "status": "READY",
      "creationTime": 0,
      "expirationTime": 0,
      "clientId": "string"
    }
  ],
  "nextCursor": "string",
  "previousCursor": "string"
}
Retrieves sessions with given IDs. The request will fail if any of the IDs does not belong to an existing session.
List of session IDs to retrieve
required | Array of objects [ 1 .. 1000 ] items |
{
  "items": [
    { "id": 1 }
  ]
}
{
  "items": [
    {
      "id": 105049194919491,
      "type": "TOKEN_EXCHANGE",
      "status": "ACTIVE",
      "creationTime": 1638795559528,
      "expirationTime": 1638795559628
    }
  ]
}
Revoke access to a session. Revocation of a session may in some cases take up to 1 hour to take effect.
A request containing the information needed to revoke sessions.
Array of objects (RevokeSessionRequest) |
{
  "items": [
    { "id": 0 }
  ]
}
{
  "items": [
    {
      "id": 0,
      "type": "CLIENT_CREDENTIALS",
      "status": "REVOKED",
      "creationTime": 1638795554528,
      "expirationTime": 1638795554528,
      "clientId": "client-123"
    }
  ]
}
Inspect CDF access granted to an IdP-issued token.
{
  "subject": "string",
  "projects": [
    {
      "projectUrlName": "string",
      "groups": [0]
    }
  ],
  "capabilities": [
    {
      "groupsAcl": {
        "actions": ["LIST"],
        "scope": { "all": {} }
      },
      "projectScope": { "allProjects": {} }
    }
  ]
}
User profiles are an authoritative source of core user profile information (email, name, job title, etc.) for principals, based on data from the identity provider configured for the CDF project.
User profiles are first created (usually within a few seconds) when a principal issues a request against a CDF API. We currently don't support automatic exchange of user identity information between the identity provider and CDF, but the profile data is updated regularly with the latest data from the identity provider for the principals issuing requests against a CDF API.
Note that user profile data is mutable, and any updates in the external identity provider may also cause updates in this API. Therefore, you cannot use profile data, for example a user's email, to uniquely identify a principal. The exception is the userIdentifier property, which is guaranteed to be immutable.
Retrieves the user profile of the principal issuing the request. If a principal doesn't have a user profile, you get a not found (404) response code.
{
  "userIdentifier": "abcd",
  "givenName": "Jane",
  "surname": "Doe",
  "email": "jane.doe@example.com",
  "displayName": "Jane Doe",
  "jobTitle": "Software Engineer",
  "lastUpdatedTime": 0
}
List all user profiles in the current project. This operation supports pagination by cursor. The results are ordered alphabetically by name.
limit | integer [ 1 .. 1000 ] Default: 25 Limits the number of results to be returned. The server returns no more than 1000 results even if the specified limit is larger. The default limit is 25. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
{
  "items": [
    {
      "userIdentifier": "abcd",
      "givenName": "Jane",
      "surname": "Doe",
      "email": "jane.doe@example.com",
      "displayName": "Jane Doe",
      "jobTitle": "Software Engineer",
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieve one or more user profiles indexed by the user identifier in the same CDF project.
Specify a maximum of 1000 unique IDs.
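Since each request takes at most 1000 unique IDs, larger sets must be split into batches. A sketch of that batching, using the request-body shape shown in the example below (the helper name is illustrative, not part of any SDK):

```python
def chunk_ids(ids, size=1000):
    """Split user identifiers into request bodies of at most `size`
    unique items, matching the endpoint's 1000-ID limit."""
    unique = list(dict.fromkeys(ids))  # de-duplicate, preserving order
    return [
        {"items": [{"userIdentifier": u} for u in unique[i:i + size]]}
        for i in range(0, len(unique), size)
    ]

bodies = chunk_ids(["abcd", "efgh", "abcd"])
print(len(bodies), len(bodies[0]["items"]))  # -> 1 2
```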
Array of objects (UserIdentifier) |
{
  "items": [
    { "userIdentifier": "abcd" }
  ]
}
{
  "items": [
    {
      "userIdentifier": "abcd",
      "givenName": "Jane",
      "surname": "Doe",
      "email": "jane.doe@example.com",
      "displayName": "Jane Doe",
      "jobTitle": "Software Engineer",
      "lastUpdatedTime": 0
    }
  ]
}
Search user profiles in the current project. The result set ordering and match criteria threshold may change over time. This operation does not support pagination.
Query for user profile search.
object | |
limit | integer <int32> [ 1 .. 1000 ] Default: 25 Limits the maximum number of results returned by a single request. The default is 25. |
{
  "search": { "name": "string" },
  "limit": 25
}
{
  "items": [
    {
      "userIdentifier": "abcd",
      "givenName": "Jane",
      "surname": "Doe",
      "email": "jane.doe@example.com",
      "displayName": "Jane Doe",
      "jobTitle": "Software Engineer",
      "lastUpdatedTime": 0
    }
  ]
}
The assets resource type stores digital representations of objects or groups of objects from the physical world. Assets are organized in hierarchies. For example, a water pump asset can be a part of a subsystem asset on an oil platform asset.
Rate and concurrency limits apply to some of the endpoints. If a request exceeds one of the limits, it will be throttled with a 429: Too Many Requests response. More on limit types and how to avoid being throttled is described here.
The following limits apply to the List assets, Filter assets, Aggregate assets, and Search assets endpoints. These limits apply to all endpoints simultaneously, i.e., requests made to different endpoints are counted together. Please note the additional conditions that apply to the Aggregate assets endpoint, as this endpoint provides the most resource-consuming operations.
Limit | Per project | Per user (identity) |
---|---|---|
Rate | 30 rps total, out of which no more than 15 rps to Aggregate | 20 rps, out of which no more than 10 rps to Aggregate |
Concurrency | 15 parallel requests, out of which no more than 6 to Aggregate | 10 parallel requests, out of which no more than 4 to Aggregate |
The aggregation API lets you compute aggregated results on assets, such as getting the count of all assets in a project, checking the different names and descriptions of assets in your project, etc.
Filters behave the same way as for the Filter assets endpoint. In text properties, the values are aggregated in a case-insensitive manner.
aggregateFilter works similarly to advancedFilter but always applies to aggregate properties. For instance, in the case of an aggregation on the source property, only the values (aka buckets) of the source property can be filtered out.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval usage. It is subject to the new throttling schema (limited request rate and concurrency). Please check the Assets resource description for more information.
aggregate | string Value: "count" Type of aggregation to apply. |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) A filter DSL (Domain Specific Language) to define advanced filter queries. See more information about the filtering DSL here. |
object (Filter) Filter on assets with strict matching. |
{
  "aggregate": "count",
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            {
              "equals": {
                "property": ["metadata", "asset_type"],
                "value": "gas pump"
              }
            },
            {
              "in": {
                "property": ["source"],
                "values": ["blueprint", "inventory"]
              }
            },
            {
              "range": {
                "property": ["dataSetId"],
                "gte": 1,
                "lt": 10
              }
            }
          ]
        }
      },
      {
        "and": [
          {
            "containsAny": {
              "property": ["labels"],
              "values": ["pump", "cooler"]
            }
          },
          {
            "equals": {
              "property": ["parentId"],
              "value": 95867294876
            }
          }
        ]
      },
      {
        "search": {
          "property": ["description"],
          "value": "My favorite pump"
        }
      }
    ]
  },
  "filter": {
    "name": "string",
    "parentIds": [1],
    "parentExternalIds": ["my.known.id"],
    "rootIds": [{ "id": 1 }],
    "assetSubtreeIds": [{ "id": 1 }],
    "dataSetIds": [{ "id": 1 }],
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "source": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "root": true,
    "externalIdPrefix": "my.known.prefix",
    "labels": {
      "containsAny": [{ "externalId": "my.known.id" }]
    },
    "geoLocation": {
      "relation": "INTERSECTS",
      "shape": {
        "type": "Point",
        "coordinates": [0, 0]
      }
    }
  }
}
{
  "items": [
    { "count": 10 }
  ]
}
You can create a maximum of 1000 assets per request.
List of the assets to create. You can create a maximum of 1000 assets per request.
required | Array of objects (DataExternalAssetItem) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{ "externalId": "my.known.id" }],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      }
    }
  ]
}
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{ "id": 1 }]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{ "externalId": "my.known.id" }],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ]
}
Delete assets. By default, recursive=false
and the request will fail if it attempts to delete an asset that is referenced as a parent by other assets. To delete such assets together with all their descendants, set recursive to true. The limit of the request does not include the number of descendants that are deleted.
required | Array of AssetInternalId (object) or AssetExternalId (object) (AssetIdEither) [ 1 .. 1000 ] items |
recursive | boolean Default: false Recursively delete all asset subtrees under the specified IDs. |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {"id": 1}
  ],
  "recursive": false,
  "ignoreUnknownIds": false
}
{ }
Retrieve a list of assets in the same project. This operation supports pagination by cursor. Apply filtering and advanced filtering criteria to select a subset of assets.

Advanced filters let you create complex filtering expressions that combine simple operations, such as equals, prefix, and exists, using the Boolean operators and, or, and not.
Filtering applies to basic fields as well as metadata.
See the advancedFilter attribute in the example, and more information about the filtering DSL here.
Leaf filter | Supported fields | Description |
---|---|---|
containsAll | Array type fields | Only includes results which contain all of the specified values. `{"containsAll": {"property": ["property"], "values": [1, 2, 3]}}` |
containsAny | Array type fields | Only includes results which contain at least one of the specified values. `{"containsAny": {"property": ["property"], "values": [1, 2, 3]}}` |
equals | Non-array type fields | Only includes results that are equal to the specified value. `{"equals": {"property": ["property"], "value": "example"}}` |
exists | All fields | Only includes results where the specified property exists (has a value). `{"exists": {"property": ["property"]}}` |
in | Non-array type fields | Only includes results that are equal to one of the specified values. `{"in": {"property": ["property"], "values": [1, 2, 3]}}` |
prefix | String type fields | Only includes results which start with the specified value. `{"prefix": {"property": ["property"], "value": "example"}}` |
range | Non-array type fields | Only includes results that fall within the specified range. `{"range": {"property": ["property"], "gt": 1, "lte": 5}}` Supported operators: gt, lt, gte, lte |
search | `["name"]`, `["description"]` | Introduced to provide functional parity with the /assets/search endpoint. `{"search": {"property": ["property"], "value": "example"}}` |
The search leaf filter provides functional parity with the /assets/search endpoint.
It's available only for the `["name"]` and `["description"]` properties.
When only this filter is specified, with no explicit ordering, the behavior is the same as that of the /assets/search/ endpoint without filters.
Explicit sorting overrides the default ordering by relevance.
The search leaf filter can be combined with any other leaf filter to create complex queries.
See the search filter in the advancedFilter attribute in the example.
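Because the filter DSL is plain JSON, leaf filters and Boolean operators compose by simple nesting. The sketch below builds a request-body fragment with a few hypothetical helper functions (the operator and property names come from the API above; the helpers themselves are illustrative, not part of any Cognite SDK):

```javascript
// Illustrative helpers for composing the advanced filter DSL as plain objects.
// The operator names (equals, prefix, and, or, not) come from the API;
// the helper function names are made up for this sketch.
const equals = (property, value) => ({ equals: { property, value } });
const prefix = (property, value) => ({ prefix: { property, value } });
const and = (...clauses) => ({ and: clauses });
const or = (...clauses) => ({ or: clauses });
const not = (clause) => ({ not: clause });

// Assets whose name matches "pump" exactly or by prefix, excluding one source:
const advancedFilter = and(
  or(equals(["name"], "pump"), prefix(["name"], "pump")),
  not(equals(["source"], "blueprint"))
);

console.log(JSON.stringify(advancedFilter, null, 2));
```

The resulting object can be sent as the `advancedFilter` attribute of the request body shown in the example.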
A filter query has the following limitations:

- and and or clauses must have at least one element.
- The property array of each leaf filter has the following limitations:
  - containsAll, containsAny, and in filter values array size must be in the range [1, 100].
  - containsAll, containsAny, and in filter values array must contain elements of a primitive type (number, string).
  - The range filter must have at least one of the gt, gte, lt, lte attributes. But gt is mutually exclusive to gte, while lt is mutually exclusive to lte. At least one of the bounds must be specified.
  - gt, gte, lt, lte in the range filter must be a primitive value.
  - The search filter value must not be blank, and its length must be in the range [1, 128].
- The maximum length of the filter value depends on the property:
  - externalId - 255
  - name - 128 for the search filter and 255 for other filters
  - description - 128 for the search filter and 255 for other filters
  - labels item - 255
  - source - 128
  - metadata key - 128

By default, assets are sorted by id in ascending order.
Use the search leaf filter to sort the results by relevance.
Sorting by other fields can be explicitly requested. The order field is optional and defaults to desc for _score_ and asc for all other fields.
The nulls field is optional and defaults to auto. auto is translated to last for the asc order and to first for the desc order by the service.
Partitions are done independently of sorting; there's no guarantee of the sort order between elements from different partitions.
See the sort attribute in the example.
If the nulls attribute has the auto value or isn't specified, null (missing) values are considered to be bigger than any other values.
They are placed last when sorting in the asc order and first when sorting in desc.
Otherwise, missing values are placed according to the nulls attribute (last or first), and their placement doesn't depend on the order value.
Values such as empty strings aren't considered as nulls.
Use the special sort property _score_ when sorting by relevance.
The more filters a particular asset matches, the higher its score is. This can be useful,
for example, when building UIs. Let's assume we want exact matches to be displayed above matches by
prefix, as in the request below. An asset named pump will match both the equals and prefix filters and will therefore have a higher score than assets with names like pump valve that match only the prefix filter.
"advancedFilter" : {
"or" : [
{
"equals": {
"property": ["name"],
"value": "pump"
}
},
{
"prefix": {
"property": ["name"],
"value": "pump"
}
}
]
},
"sort": [
{
"property" : ["_score_"]
}
]
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval. It is subject to a new throttling schema (limited request rate and concurrency). See the Assets resource description for more information.
object (Filter) Filter on assets with strict matching. | |||||||||||||||||||||||||||||
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) A filter DSL (Domain Specific Language) to define advanced filter queries. See more information about filtering DSL here. Supported properties:
Note: Filtering on the | |||||||||||||||||||||||||||||
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. | ||||||||||||||||||||||||||||
Array of objects (AssetSortProperty) [ 1 .. 2 ] items Sort by array of selected properties. | |||||||||||||||||||||||||||||
cursor | string | ||||||||||||||||||||||||||||
aggregatedProperties | Array of strings (AggregatedProperty) Items Enum: "childCount" "path" "depth" Set of aggregated properties to include | ||||||||||||||||||||||||||||
partition | string (Partition) Splits the data set into N partitions; the value has the format `m/n`, as described in the Parallel retrieval section above. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently; for example, CDF may reduce the number of partitions to 10. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
{
  "filter": {
    "name": "string",
    "parentIds": [1],
    "parentExternalIds": ["my.known.id"],
    "rootIds": [{"id": 1}],
    "assetSubtreeIds": [{"id": 1}],
    "dataSetIds": [{"id": 1}],
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "source": "string",
    "createdTime": {"max": 0, "min": 0},
    "lastUpdatedTime": {"max": 0, "min": 0},
    "root": true,
    "externalIdPrefix": "my.known.prefix",
    "labels": {
      "containsAny": [{"externalId": "my.known.id"}]
    },
    "geoLocation": {
      "relation": "INTERSECTS",
      "shape": {
        "type": "Point",
        "coordinates": [0, 0]
      }
    }
  },
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            {"equals": {"property": ["metadata", "asset_type"], "value": "gas pump"}},
            {"in": {"property": ["source"], "values": ["blueprint", "inventory"]}},
            {"range": {"property": ["dataSetId"], "gte": 1, "lt": 10}}
          ]
        }
      },
      {
        "and": [
          {"containsAny": {"property": ["labels"], "values": ["pump", "cooler"]}},
          {"equals": {"property": ["parentId"], "value": 95867294876}}
        ]
      },
      {"search": {"property": ["description"], "value": "My favorite pump"}}
    ]
  },
  "limit": 100,
  "sort": [
    {"property": ["createdTime"], "order": "desc"},
    {"property": ["metadata", "customMetadataKey"], "nulls": "first"}
  ],
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "aggregatedProperties": ["childCount"],
  "partition": "1/10"
}
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{"id": 1}]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{"externalId": "my.known.id"}],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ],
  "nextCursor": "string"
}
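The `nextCursor` handshake in the response above can be sketched as a small loop. Here, `post` is a stand-in for an authenticated HTTP POST to the endpoint that returns the parsed JSON body (project URL and auth headers omitted); the cursor handling and the rule that all other parameters stay the same between pages are the parts the API prescribes:

```javascript
// Sketch: page through /assets/list by feeding nextCursor back as cursor.
// `post` is a stand-in for an authenticated request returning parsed JSON.
async function listAllAssets(post, body = {}) {
  const items = [];
  let cursor;
  do {
    // Keep every parameter except cursor identical between pages.
    const page = await post({ ...body, cursor });
    items.push(...page.items);
    cursor = page.nextCursor; // absent on the last page
  } while (cursor);
  return items;
}
```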
List all assets, or only the assets matching the specified query.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval. It is subject to a new throttling schema (limited request rate and concurrency). See the Assets resource description for more information.
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
includeMetadata | boolean Default: true Whether the metadata field should be returned or not. |
name | string (AssetName) [ 1 .. 140 ] characters The name of the asset. |
parentIds | string <jsonArray(int64)> (JsonArrayInt64) Example: parentIds=[363848954441724, 793045462540095, 1261042166839739] List only assets that have one of the parentIds as a parent. The parentId for root assets is null. |
parentExternalIds | string <jsonArray(string)> (JsonArrayString) Example: parentExternalIds=[externalId_1, externalId_2, externalId_3] List only assets that have one of the parentExternalIds as a parent. The parentId for root assets is null. |
rootIds | string <jsonArray(int64)> (JsonArrayInt64) Deprecated Example: rootIds=[363848954441724, 793045462540095, 1261042166839739] This parameter is deprecated. Use assetSubtreeIds instead. List only assets that have one of the rootIds as a root asset. A root asset is its own root asset. |
assetSubtreeIds | string <jsonArray(int64)> (JsonArrayInt64) Example: assetSubtreeIds=[363848954441724, 793045462540095, 1261042166839739] List only assets that are in a subtree rooted at any of these assetIds (including the roots given). If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
assetSubtreeExternalIds | string <jsonArray(string)> (JsonArrayString) Example: assetSubtreeExternalIds=[externalId_1, externalId_2, externalId_3] List only assets that are in a subtree rooted at any of these assetExternalIds. If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
source | string <= 128 characters The source of the asset, for example which database it's from. |
root | boolean Default: false Whether the filtered assets are root assets, or not. Set to True to only list root assets. |
minCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
externalIdPrefix | string (CogniteExternalIdPrefix) <= 255 characters Example: externalIdPrefix=my.known.prefix Filter by this (case-sensitive) prefix for the external ID. |
partition | string Example: partition=1/10 Splits the data set into N partitions; the value has the format `m/n`, as described in the Parallel retrieval section above. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently; for example, CDF may reduce the number of partitions to 10. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
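Parallel retrieval with the `partition` parameter can be sketched as below. `fetchPage` is a stand-in for the authenticated list request; the `m/n` format and the rule that the same partition value is passed to every subquery while paging within a partition come from the API:

```javascript
// Sketch: download a full data set by splitting it into n partitions
// fetched in parallel, each paged independently with its own cursor.
async function downloadAllPartitions(fetchPage, n = 10) {
  const partitions = Array.from({ length: n }, (_, i) => `${i + 1}/${n}`);
  const perPartition = await Promise.all(
    partitions.map(async (partition) => {
      const items = [];
      let cursor;
      do {
        // The partition value stays fixed for all subqueries of this worker.
        const page = await fetchPage({ partition, cursor });
        items.push(...page.items);
        cursor = page.nextCursor;
      } while (cursor);
      return items;
    })
  );
  return perPartition.flat();
}
```

Remember that requests beyond the concurrency quota are answered with 429, so a retry strategy (see the Request throttling notes) still applies to each worker.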
const assets = await client.assets.list({ filter: { name: '21PT1019' } });
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{"id": 1}]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{"externalId": "my.known.id"}],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ],
  "nextCursor": "string"
}
Retrieve an asset by its ID. If you want to retrieve assets by externalIds, use Retrieve assets instead.
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
const assets = await client.assets.retrieve([{id: 123}, {externalId: 'abc'}]);
{
  "createdTime": 0,
  "lastUpdatedTime": 0,
  "rootId": 1,
  "aggregates": {
    "childCount": 0,
    "depth": 0,
    "path": [{"id": 1}]
  },
  "parentId": 1,
  "parentExternalId": "my.known.id",
  "externalId": "my.known.id",
  "name": "string",
  "description": "string",
  "dataSetId": 1,
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "source": "string",
  "labels": [{"externalId": "my.known.id"}],
  "geoLocation": {
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [0, 0]
    },
    "properties": {}
  },
  "id": 1
}
Retrieve assets by IDs or external IDs. If you specify to get aggregates, then be aware that the aggregates are eventually consistent.
All provided IDs and external IDs must be unique.
required | Array of AssetInternalId (object) or AssetExternalId (object) (AssetIdEither) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
aggregatedProperties | Array of strings (AggregatedProperty) Items Enum: "childCount" "path" "depth" Set of aggregated properties to include |
{
  "items": [
    {"id": 1}
  ],
  "ignoreUnknownIds": false,
  "aggregatedProperties": ["childCount"]
}
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{"id": 1}]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{"externalId": "my.known.id"}],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ]
}
Fulltext search for assets based on result relevance. Primarily meant for human-centric use-cases, not for programs, since matching and ordering may change over time. Additional filters can also be specified. This operation doesn't support pagination.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval. It is subject to a new throttling schema (limited request rate and concurrency). See the Assets resource description for more information.
Search query
object (Filter) Filter on assets with strict matching. | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
object (Search) Fulltext search for assets. Primarily meant for human-centric use-cases, not for programs. The query parameter uses a different search algorithm than the deprecated name and description parameters, and will generally give much better results. |
{
  "filter": {
    "parentIds": [1293812938, 293823982938]
  },
  "search": {
    "name": "flow",
    "description": "upstream"
  }
}
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{"id": 1}]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{"externalId": "my.known.id"}],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ]
}
Update the attributes of assets.
All provided IDs and external IDs must be unique. Fields that aren't included in the request aren't changed.
required | Array of AssetChangeById (object) or AssetChangeByExternalId (object) (AssetChange) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "update": {
        "externalId": {"set": "my.known.id"},
        "name": {"set": "string"},
        "description": {"set": "string"},
        "dataSetId": {"set": 0},
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "source": {"set": "string"},
        "parentId": {"set": 1},
        "parentExternalId": {"set": "my.known.id"},
        "labels": {
          "add": [{"externalId": "my.known.id"}],
          "remove": [{"externalId": "my.known.id"}]
        },
        "geoLocation": {
          "set": {
            "type": "Feature",
            "geometry": {
              "type": "Point",
              "coordinates": [0, 0]
            },
            "properties": {}
          }
        }
      },
      "id": 1
    }
  ]
}
{
  "items": [
    {
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "rootId": 1,
      "aggregates": {
        "childCount": 0,
        "depth": 0,
        "path": [{"id": 1}]
      },
      "parentId": 1,
      "parentExternalId": "my.known.id",
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "labels": [{"externalId": "my.known.id"}],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [0, 0]
        },
        "properties": {}
      },
      "id": 1
    }
  ]
}
A time series consists of a sequence of data points connected to a single asset. For example, a water pump asset can have a temperature time series that records a data point in units of °C every second.
A single asset can have several time series. The water pump could have additional time series measuring pressure within the pump, rpm, flow volume, power consumption, and more.

Time series store data points as either numbers or strings. This is controlled by the is_string flag on the time series object. Numerical data points can be aggregated before they are returned from a query (e.g., to find the average temperature for a day). String data points, on the other hand, can't be aggregated by CDF, but can store arbitrary information like states (e.g., "open"/"closed") or more complex information (JSON).
Cognite stores discrete data points, but the underlying process measured by the data points can vary continuously.
When interpolating between data points, we can either assume that each value stays the same until the next measurement, or that it changes linearly between the two measurements.
This is controlled by the isStep flag on the time series object.
For example, if we estimate the average over a time window containing two data points, the average will either be close to the first value (isStep) or close to the mean of the two (not isStep).
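The two interpolation modes can be sketched with a small helper (illustrative code, not an SDK or API function; CDF performs this interpolation server-side during aggregation):

```javascript
// Sketch: the assumed value of a time series at time t between two stored
// data points (t0, v0) and (t1, v1). A step series (isStep) holds the
// previous value until the next measurement; otherwise the value is
// interpolated linearly between the two measurements.
function valueAt(t, [t0, v0], [t1, v1], isStep) {
  if (isStep) return v0; // previous value persists until the next point
  return v0 + ((v1 - v0) * (t - t0)) / (t1 - t0); // linear interpolation
}
```

Averaging a window that spans both points then lands near the first value for a step series, and near the mean of the two otherwise, matching the behavior described above.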
A data point stores a single piece of information, a number or a string, associated with a specific time. Data points are identified by their timestamps, measured in milliseconds since the unix epoch -- 00:00:00.000, January 1st, 1970. The time series service accepts timestamps in the range from 00:00:00.000, January 1st, 1900 through 23:59:59.999, December 31st, 2099 (in other words, every millisecond in the two centuries from 1900 to but not including 2100). Negative timestamps are used to define dates before 1970. Milliseconds is the finest time resolution supported by CDF, i.e., fractional milliseconds are not supported. Leap seconds are not counted.
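The accepted timestamp range can be expressed as a small validation helper (the bounds come from the paragraph above; the constant and function names are illustrative, not part of any SDK):

```javascript
// Valid CDF data point timestamps: whole milliseconds from
// 1900-01-01T00:00:00.000Z up to but not including 2100-01-01T00:00:00.000Z.
const MIN_TIMESTAMP = Date.UTC(1900, 0, 1); // negative: before the unix epoch
const MAX_TIMESTAMP = Date.UTC(2100, 0, 1) - 1; // 2099-12-31T23:59:59.999Z

function isValidDatapointTimestamp(ms) {
  // Fractional milliseconds are not supported, so the value must be an integer.
  return Number.isInteger(ms) && ms >= MIN_TIMESTAMP && ms <= MAX_TIMESTAMP;
}
```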
Numerical data points can be aggregated before they are retrieved from CDF. This allows for faster queries by reducing the amount of data transferred. You can aggregate data points by specifying one or more aggregates (e.g., average, minimum, maximum) as well as the time granularity over which the aggregates should be applied (e.g., “1h” for one hour).
Aggregates are aligned to the start time modulo the granularity unit. For example, if you ask for daily average temperatures since Monday afternoon last week, the first aggregated data point will contain averages for Monday, the second for Tuesday, etc. Determining aggregate alignment without considering data point timestamps allows CDF to pre-calculate aggregates (e.g., to quickly return daily average temperatures for a year). Consequently, aggregating over 60 minutes can return a different result than aggregating over 1 hour because the two queries will be aligned differently. Asset references obtained from a time series - through its asset ID - may be invalid simply by the non-transactional nature of HTTP. They are maintained in an eventually consistent manner.
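The alignment rule above ("start time modulo the granularity unit") can be sketched as flooring the query start to a whole unit (illustrative; CDF computes this server-side):

```javascript
// Sketch: align a query start time (ms since epoch) to the granularity unit.
// "60m" aligns to whole minutes while "1h" aligns to whole hours, which is
// why the two aggregations can return different results for the same window.
const UNIT_MS = { s: 1000, m: 60000, h: 3600000, d: 86400000 };

function alignedStart(startMs, granularity) {
  const m = /^(\d+)([smhd])$/.exec(granularity);
  if (!m) throw new Error(`unsupported granularity: ${granularity}`);
  const unitMs = UNIT_MS[m[2]]; // alignment uses the unit, not the full width
  return Math.floor(startMs / unitMs) * unitMs;
}
```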
The aggregation API allows you to compute aggregated results from a set of time series, such as
getting the number of time series in a project or checking what assets the different time series
in your project are associated with (along with the number of time series for each asset).
By specifying filter and/or advancedFilter, the aggregation will take place only over those time series that match the filters. filter and advancedFilter behave the same way as in the list endpoint.

If the aggregate field is not specified in the request body, the default behavior is to return the number of time series that match the filters (if any), which is the same behavior as when the aggregate field is set to count.

The following requests will both return the total number of time series whose name begins with pump:
{
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
and
{
"aggregate": "count",
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The response might be:
{"items": [{"count": 42}]}
Setting aggregate to uniqueValues and specifying a property in properties (this field is an array, but currently only supports one property) will return all unique values (up to a maximum of 1000) that are taken on by that property across all the time series that match the filters, as well as the number of time series that have each of those property values.

This example request finds all the unique asset ids that are referenced by the time series in your project whose name begins with pump:
{
"aggregate": "uniqueValues",
"properties": [{"property": ["assetId"]}],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The response might be the following, saying that 23 time series are associated with asset 18 and 107 time series are associated with asset 76:
{
"items": [
{"values": ["18"], "count": 23},
{"values": ["76"], "count": 107}
]
}
Setting aggregate to cardinalityValues will instead return the approximate number of distinct values that are taken on by the given property among the matching time series.

Example request:
{
"aggregate": "cardinalityValues",
"properties": [{"property": ["assetId"]}],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result is likely exact when the set of unique values is small. In this example, there are likely two distinct asset ids among the matching time series:
{"items": [{"count": 2}]}
Setting aggregate to uniqueProperties will return the set of unique properties whose property path begins with path (which can currently only be ["metadata"]) that are contained in the time series that match the filters.

Example request:
{
"aggregate": "uniqueProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result contains all the unique metadata keys in the time series whose name begins with pump, and the number of time series that contain each metadata key:
{
"items": [
{"values": [{"property": ["metadata", "tag"]}], "count": 43},
{"values": [{"property": ["metadata", "installationDate"]}], "count": 97}
]
}
Setting aggregate to cardinalityProperties will instead return the approximate number of different property keys whose path begins with path (which can currently only be ["metadata"], meaning that this can only be used to count the approximate number of distinct metadata keys among the matching time series).

Example request:
{
"aggregate": "cardinalityProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result is likely exact when the set of unique values is small. In this example, there are likely two distinct metadata keys among the matching time series:
{"items": [{"count": 2}]}
The aggregateFilter field may be specified if aggregate is set to cardinalityProperties or uniqueProperties. The structure of this field is similar to that of advancedFilter, except that the set of leaf filters is smaller (in, prefix, and range), and that none of the leaf filters specify a property. Unlike advancedFilter, which is applied before the aggregation (in order to restrict the set of time series that the aggregation operation should be applied to), aggregateFilter is applied after the initial aggregation has been performed, in order to restrict the set of results.

When aggregate is set to uniqueProperties, the result set contains a number of property paths, each with an associated count that shows how many time series contained that property (among those time series that matched the filter and advancedFilter, if they were specified). If aggregateFilter is specified, it will restrict the property paths included in the output. Let us add an aggregateFilter to the uniqueProperties example from above:
{
"aggregate": "uniqueProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}},
"aggregateFilter": {"prefix": {"value": "t"}}
}
Now, the result only contains those metadata properties whose key begins with t (but it will be the same set of metadata properties beginning with t as in the original query without aggregateFilter, and the counts will be the same):
{
"items": [
{"values": [{"property": ["metadata", "tag"]}], "count": 43}
]
}
Similarly, adding aggregateFilter to cardinalityProperties will return the approximate number of properties whose property key matches the aggregateFilter from those time series matching the filter and advancedFilter (or from all time series if neither filter nor advancedFilter are specified):
{
"aggregate": "cardinalityProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}},
"aggregateFilter": {"prefix": {"value": "t"}}
}
As we saw above, only one property matches:
{"items": [{"count": 1}]}
Note that aggregateFilter is also accepted when aggregate is set to cardinalityValues or cardinalityProperties. For those aggregations, the effect of any aggregateFilter could also be achieved via a similar advancedFilter. However, aggregateFilter is not accepted when aggregate is omitted or set to count.
Rate and concurrency limits apply to this endpoint. If a request exceeds one of the limits, it will be throttled with a 429: Too Many Requests response. More on limit types and how to avoid being throttled is described here.
Limit | Per project | Per user (identity) |
---|---|---|
Rate | 15 requests per second | 10 requests per second |
Concurrency | 6 concurrent requests | 4 concurrent requests |
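The recommended handling of 429 responses, truncated exponential backoff, can be sketched as below. The base delay, cap, jitter, and attempt count are illustrative choices, not values prescribed by the API:

```javascript
// Sketch: truncated exponential backoff with jitter for 429 responses.
// delay grows as base * 2^attempt but is truncated at a cap, and jitter
// spreads concurrent clients so they don't retry in lockstep.
function backoffDelayMs(attempt, baseMs = 250, capMs = 10000) {
  const exp = Math.min(baseMs * 2 ** attempt, capMs); // truncation
  return exp / 2 + Math.random() * (exp / 2); // jitter in [exp/2, exp)
}

// `send` is a stand-in for one authenticated request returning { status, ... }.
async function sendWithRetries(send, maxAttempts = 5) {
  for (let attempt = 0; attempt < maxAttempts; attempt += 1) {
    const res = await send();
    if (res.status !== 429) return res;
    await new Promise((resolve) => setTimeout(resolve, backoffDelayMs(attempt)));
  }
  throw new Error("still throttled after max retries");
}
```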
Aggregates the time series that match the given criteria.
(Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (TimeSeriesFilterLanguage) A filter DSL (Domain Specific Language) to define advanced filter queries. At the top level, an | |
object (Filter) | |
(Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (in (object) or range (object) or prefix (object))) (TimeSeriesAggregateFilter) A filter DSL (Domain Specific Language) to define aggregate filters. | |
aggregate | string Value: "count" The |
{
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            {"equals": {"property": ["metadata", "manufacturer"], "value": "acme"}},
            {"in": {"property": ["name"], "values": ["pump-1-temperature", "motor-9-temperature"]}},
            {"range": {"property": ["dataSetId"], "gte": 1, "lt": 10}}
          ]
        }
      },
      {
        "and": [
          {"equals": {"property": ["assetId"], "value": 1234}},
          {"equals": {"property": ["description"], "value": "Temperature in Celsius"}}
        ]
      }
    ]
  },
  "filter": {
    "name": "string",
    "unit": "string",
    "isString": true,
    "isStep": true,
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetExternalIds": ["my.known.id"],
    "rootAssetIds": [343099548723932, 88483999203217],
    "assetSubtreeIds": [{"id": 1}],
    "dataSetIds": [{"id": 1}],
    "externalIdPrefix": "my.known.prefix",
    "createdTime": {"max": 0, "min": 0},
    "lastUpdatedTime": {"max": 0, "min": 0}
  },
  "aggregateFilter": {
    "and": [{}]
  },
  "aggregate": "count"
}
{
  "items": [
    {"count": 0}
  ]
}
Creates one or more time series.
required | Array of objects (PostTimeSeriesMetadataDTO) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "legacyName": "string",
      "isString": false,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "unit": "string",
      "assetId": 1,
      "isStep": false,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Delete data points from time series.
The list of delete requests to perform.
required | Array of QueryWithInternalId (object) or QueryWithExternalId (object) (DatapointsDeleteRequest) [ 1 .. 10000 ] items List of delete filters. |
{
  "items": [
    {
      "inclusiveBegin": 1638795554528,
      "exclusiveEnd": 1638795554528,
      "id": 1
    }
  ]
}
{ }
Deletes the time series with the specified IDs and their data points.
Specify a list of the time series to delete.
required | Array of QueryWithInternalId (object) or QueryWithExternalId (object) [ 1 .. 1000 ] items unique List of ID objects. |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{ }
The `advancedFilter` field lets you create complex filtering expressions that combine simple operations, such as `equals`, `prefix`, and `exists`, using the Boolean operators `and`, `or`, and `not`.
Filtering applies to basic fields as well as metadata. See the `advancedFilter` syntax in the request example.
Leaf filter | Supported fields | Description and example
---|---|---
containsAll | Array type fields | Only includes results which contain all of the specified values. {"containsAll": {"property": ["property"], "values": [1, 2, 3]}}
containsAny | Array type fields | Only includes results which contain at least one of the specified values. {"containsAny": {"property": ["property"], "values": [1, 2, 3]}}
equals | Non-array type fields | Only includes results that are equal to the specified value. {"equals": {"property": ["property"], "value": "example"}}
exists | All fields | Only includes results where the specified property exists (has a value). {"exists": {"property": ["property"]}}
in | Non-array type fields | Only includes results that are equal to one of the specified values. {"in": {"property": ["property"], "values": [1, 2, 3]}}
prefix | String type fields | Only includes results which start with the specified text. {"prefix": {"property": ["property"], "value": "example"}}
range | Non-array type fields | Only includes results that fall within the specified range. {"range": {"property": ["property"], "gt": 1, "lte": 5}} Supported operators: gt, lt, gte, lte
search | ["name"] and ["description"] | Introduced to provide functional parity with the /timeseries/search endpoint. {"search": {"property": ["property"], "value": "example"}}
Property | Type
---|---
["description"] | string
["externalId"] | string
["metadata", "<someCustomKey>"] | string
["name"] | string
["unit"] | string
["assetId"] | number
["assetRootId"] | number
["createdTime"] | number
["dataSetId"] | number
["id"] | number
["lastUpdatedTime"] | number
["isStep"] | Boolean
["isString"] | Boolean
["accessCategories"] | array of strings
["securityCategories"] | array of numbers
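The filter grammar above can be modeled directly in code. The sketch below, with type and helper names of our own choosing (not part of any SDK), shows how a request body combining `prefix`, `equals`, and `range` leaf filters under `or`/`and` might be composed and serialized:

```typescript
// Sketch: composing an advancedFilter body for a time series filter request.
// The type names below are our own; only the JSON shape follows the tables above.
type LeafFilter =
  | { equals: { property: string[]; value: string | number | boolean } }
  | { prefix: { property: string[]; value: string } }
  | { range: { property: string[]; gt?: number; gte?: number; lt?: number; lte?: number } };

type BoolFilter =
  | { and: AdvancedFilter[] }
  | { or: AdvancedFilter[] }
  | { not: AdvancedFilter };

type AdvancedFilter = LeafFilter | BoolFilter;

// Time series whose name starts with "pump-" AND whose metadata.manufacturer
// equals "acme", OR whose dataSetId lies in [1, 10).
const filter: AdvancedFilter = {
  or: [
    {
      and: [
        { prefix: { property: ["name"], value: "pump-" } },
        { equals: { property: ["metadata", "manufacturer"], value: "acme" } },
      ],
    },
    { range: { property: ["dataSetId"], gte: 1, lt: 10 } },
  ],
};

const requestBody = JSON.stringify({ advancedFilter: filter, limit: 100 });
```

Typing the filter as a discriminated union lets the compiler reject bodies that mix, for example, `value` and `values` in the same leaf.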
The filter query has the following limitations:
- `and` and `or` clauses must have at least one element (and at most 99, since each element counts towards the total clause limit, and so does the `and`/`or` clause itself).
- The `property` array of each leaf filter must match one of the existing properties (a static top-level property or a dynamic metadata property).
- The `containsAll`, `containsAny`, and `in` filter `values` array size must be in the range [1, 100].
- The `containsAll`, `containsAny`, and `in` filter `values` array must contain elements of number or string type (matching the type of the given property).
- The `range` filter must have at least one of the `gt`, `gte`, `lt`, `lte` attributes. `gt` is mutually exclusive with `gte`, while `lt` is mutually exclusive with `lte`.
- `gt`, `gte`, `lt`, `lte` in the `range` filter must be of number or string type (matching the type of the given property).
- The `search` filter `value` must not be blank, its length must be in the range [1, 128], and there may be at most two `search` filters in the entire filter query.
- The maximum length of the `value` of a leaf filter applied to a string property is 256.

By default, time series are sorted by their creation time in ascending order.
Sorting by another property or by several other properties can be explicitly requested via the `sort` field, which must contain a list of one or more sort specifications. Each sort specification indicates the property to sort on and, optionally, the order in which to sort (defaults to `asc`). If multiple sort specifications are supplied, the results are sorted on the first property, and those with the same value for the first property are sorted on the second property, and so on.
Partitioning is done independently of sorting; there is no guarantee of sort order between elements from different partitions.
If the `nulls` field has the `auto` value, or the field isn't specified, null (missing) values are considered bigger than any other values. They are placed last when sorting in the `asc` order and first in the `desc` order. Otherwise, missing values are placed according to the `nulls` field (`last` or `first`), and their placement won't depend on the `order` field. Note that the number zero, empty strings, and empty lists are all considered not null.
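The null-placement rules above can be made concrete with a small comparator. This is a sketch of the described semantics, not service code:

```typescript
// Sketch of the null-placement rules: with nulls "auto", missing values are
// treated as bigger than anything (last in asc, first in desc); with an
// explicit "first"/"last", placement ignores the order field.
type Order = "asc" | "desc";
type Nulls = "auto" | "first" | "last";

function compare(
  a: string | null,
  b: string | null,
  order: Order = "asc",
  nulls: Nulls = "auto",
): number {
  const nullsLast = nulls === "auto" ? order === "asc" : nulls === "last";
  if (a === null || b === null) {
    if (a === b) return 0;
    const aNull = a === null ? 1 : 0;
    const bNull = b === null ? 1 : 0;
    return nullsLast ? aNull - bNull : bNull - aNull;
  }
  const cmp = a < b ? -1 : a > b ? 1 : 0;
  return order === "asc" ? cmp : -cmp;
}
```

For example, `["b", null, "a"].sort((x, y) => compare(x, y))` would place the null last.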
{
"sort": [
{
"property" : ["createdTime"],
"order": "desc",
"nulls": "last"
},
{
"property" : ["metadata", "<someCustomKey>"]
}
]
}
You can sort on the following properties:
Property |
---|
["assetId"] |
["createdTime"] |
["dataSetId"] |
["description"] |
["externalId"] |
["lastUpdatedTime"] |
["metadata", "<someCustomKey>"] |
["name"] |
The `sort` array must contain 1 to 2 elements.
object (Filter) | |
(Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (TimeSeriesFilterLanguage) A filter DSL (Domain Specific Language) to define advanced filter queries. At the top level, an | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Return up to this many results. |
cursor | string |
partition | string (Partition) Splits the data set into N partitions. To prevent unexpected problems and maximize read throughput, use at most 10 (N <= 10) partitions. When you use more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10. In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
Array of objects (TimeSeriesSortItem) [ 1 .. 2 ] items Sort by array of selected properties. |
{
  "filter": {
    "name": "string",
    "unit": "string",
    "isString": true,
    "isStep": true,
    "metadata": { "property1": "string", "property2": "string" },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetExternalIds": ["my.known.id"],
    "rootAssetIds": [343099548723932, 88483999203217],
    "assetSubtreeIds": [{ "id": 1 }],
    "dataSetIds": [{ "id": 1 }],
    "externalIdPrefix": "my.known.prefix",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 }
  },
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            { "equals": { "property": ["metadata", "manufacturer"], "value": "acme" } },
            { "in": { "property": ["name"], "values": ["pump-1-temperature", "motor-9-temperature"] } },
            { "range": { "property": ["dataSetId"], "gte": 1, "lt": 10 } }
          ]
        }
      },
      {
        "and": [
          { "equals": { "property": ["assetId"], "value": 1234 } },
          { "equals": { "property": ["description"], "value": "Temperature in Celsius" } }
        ]
      }
    ]
  },
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "partition": "1/10",
  "sort": [
    {
      "property": ["string"],
      "order": "asc",
      "nulls": "first"
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": { "property1": "string", "property2": "string" },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Insert data points into a time series. You can do this for multiple time series. If you insert a data point with a timestamp that already exists, it will be overwritten with the new value.
The datapoints to insert.
required | Array of DatapointsWithInternalId (object) or DatapointsWithExternalId (object) (DatapointsPostDatapoint) [ 1 .. 10000 ] items |
{
  "items": [
    {
      "datapoints": [
        {
          "timestamp": -2208988800000,
          "value": 0
        }
      ],
      "id": 1
    }
  ]
}
{ }
List time series. Use `nextCursor` to paginate through the results.
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. CDF returns a maximum of 1000 results even if you specify a higher limit. |
includeMetadata | boolean Default: true Whether the metadata field should be returned or not. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
partition | string Example: partition=1/10 Splits the data set into N partitions. To prevent unexpected problems and maximize read throughput, use at most 10 (N <= 10) partitions. When you use more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10. In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
assetIds | string <jsonArray(int64)> (JsonArrayInt64) Example: assetIds=[363848954441724, 793045462540095, 1261042166839739] Gets the time series related to the assets. The format is a list of IDs serialized as a JSON array(int64). Takes [ 1 .. 100 ] unique items. |
rootAssetIds | string <jsonArray(int64)> (JsonArrayInt64) Example: rootAssetIds=[363848954441724, 793045462540095, 1261042166839739] Only includes time series that have a related asset in a tree rooted at any of these root |
externalIdPrefix | string (CogniteExternalIdPrefix) <= 255 characters Example: externalIdPrefix=my.known.prefix Filter by this (case-sensitive) prefix for the external ID. |
const timeseries = await client.timeseries.list({ filter: { assetIds: [1, 2] }});
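The cursor-based pagination described above follows a standard loop: request a page, collect its items, and repeat with `nextCursor` until it is absent. A minimal sketch, where `fetchPage` is a hypothetical stand-in for an authenticated HTTP call:

```typescript
// Sketch: draining a paginated listing via nextCursor.
// `fetchPage` is a hypothetical helper, not part of the Cognite SDK.
interface Page<T> {
  items: T[];
  nextCursor?: string;
}

async function listAll<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    all.push(...page.items);
    cursor = page.nextCursor; // undefined on the last page
  } while (cursor !== undefined);
  return all;
}
```

Remember that all query parameters except `cursor` must stay the same between pages.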
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": { "property1": "string", "property2": "string" },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieves a list of data points from multiple time series in a project. This operation supports aggregation and pagination. Learn more about aggregation.
Note: when `start` isn't specified at the top level or for an individual item, it defaults to epoch 0, which is 1 January 1970, thus excluding any potential data points before 1970. Specify `start` as a negative number to get data points before 1970.
Specify parameters to query for multiple data points. If you omit fields in individual data point query items, the top-level field values are used. For example, you can specify a default limit for all items by setting the top-level limit field. If you request aggregates, only the aggregates are returned. If you don't request any aggregates, all data points are returned.
required | Array of QueryWithInternalId (object) or QueryWithExternalId (object) (DatapointsQuery) [ 1 .. 100 ] items |
integer or string (TimestampOrStringStart) Get datapoints starting from, and including, this time. The format is N[timeunit]-ago where timeunit is w,d,h,m,s. Example: '2d-ago' gets datapoints that are up to 2 days old. You can also specify time in milliseconds since epoch. Note that for aggregates, the start time is rounded down to a whole granularity unit (in UTC timezone). Daily granularities (d) are rounded to 0:00 AM; hourly granularities (h) to the start of the hour, etc. | |
integer or string (TimestampOrStringEnd) Get datapoints up to, but excluding, this point in time. Same format as for start. Note that when using aggregates, the end will be rounded up such that the last aggregate represents a full aggregation interval containing the original end, where the interval is the granularity unit times the granularity multiplier. For granularity 2d, the aggregation interval is 2 days, if end was originally 3 days after the start, it will be rounded to 4 days after the start. | |
limit | integer <int32> Default: 100 Returns up to this number of data points. The maximum is 100000 non-aggregated data points and 10000 aggregated data points in total across all queries in a single request. |
aggregates | Array of strings (Aggregate) [ 1 .. 10 ] items unique Items Enum: "average" "max" "min" "count" "sum" "interpolation" "stepInterpolation" "totalVariation" "continuousVariance" "discreteVariance" Specify the aggregates to return. Omit to return data points without aggregation. |
granularity | string The time granularity size and unit to aggregate over. Valid entries are 'day, hour, minute, second', or short forms 'd, h, m, s', or a multiple of these indicated by a number as a prefix. For 'second' and 'minute', the multiple must be an integer between 1 and 120 inclusive; for 'hour' and 'day', the multiple must be an integer between 1 and 100000 inclusive. For example, a granularity '5m' means that aggregates are calculated over 5 minutes. This field is required if aggregates are specified. |
includeOutsidePoints | boolean Default: false Defines whether to include the last data point before the requested time period and the first one after. This option can be useful for interpolating data. It's not available for aggregates or cursors.
Note: If there are more than |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
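The start-time rounding for aggregate queries described above (round down to a whole granularity unit in UTC) can be sketched as follows; this is an illustration of the stated rule, not service code:

```typescript
// Sketch: round an aggregate query's start down to a whole granularity unit
// (UTC). Per the description, rounding aligns to the unit itself, not the
// multiple: e.g. "2d" aligns to 0:00 UTC, "5m" to the start of the minute.
const UNIT_MS: Record<string, number> = {
  s: 1_000,
  m: 60_000,
  h: 3_600_000,
  d: 86_400_000,
};

function roundStartDown(startMs: number, granularity: string): number {
  const unit = granularity.slice(-1); // "5m" -> "m"
  const ms = UNIT_MS[unit];
  if (ms === undefined) throw new Error(`unknown granularity unit: ${unit}`);
  return Math.floor(startMs / ms) * ms;
}
```

For example, a start of 3,700,000 ms with granularity `1h` is rounded down to 3,600,000 ms, the start of that UTC hour.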
{
  "items": [
    {
      "start": 0,
      "end": 0,
      "limit": 0,
      "aggregates": ["average"],
      "granularity": "1h",
      "includeOutsidePoints": false,
      "cursor": "string",
      "id": 1
    }
  ],
  "start": 0,
  "end": 0,
  "limit": 100,
  "aggregates": ["average"],
  "granularity": "1h",
  "includeOutsidePoints": false,
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "isString": false,
      "isStep": true,
      "unit": "string",
      "nextCursor": "string",
      "datapoints": [
        {
          "timestamp": 1638795554528,
          "average": 0,
          "max": 0,
          "min": 0,
          "count": 0,
          "sum": 0,
          "interpolation": 0,
          "stepInterpolation": 0,
          "continuousVariance": 0,
          "discreteVariance": 0,
          "totalVariation": 0
        }
      ]
    }
  ]
}
Retrieves the latest data point in one or more time series. Note that the latest data point in a time series is the one with the highest timestamp, which is not necessarily the one that was ingested most recently.
The list of the queries to perform.
required | Array of QueryWithInternalId (object) or QueryWithExternalId (object) (LatestDataBeforeRequest) [ 1 .. 100 ] items List of latest queries |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {
      "before": "now",
      "id": 1
    }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "isString": true,
      "isStep": true,
      "unit": "string",
      "nextCursor": "string",
      "datapoints": [
        {
          "timestamp": 1638795554528,
          "value": 0
        }
      ]
    }
  ]
}
Retrieves one or more time series by ID or external ID. The response returns the time series in the same order as in the request.
List of the IDs of the time series to retrieve.
required | Array of QueryWithInternalId (object) or QueryWithExternalId (object) [ 1 .. 1000 ] items unique List of ID objects. |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": { "property1": "string", "property2": "string" },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Fulltext search for time series based on result relevance. Primarily meant for human-centric use cases, not for programs, since matching and order may change over time. Additional filters can also be specified. This operation does not support pagination.
object (Filter) | |
object (Search) | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Return up to this many results. |
{
  "filter": {
    "name": "string",
    "unit": "string",
    "isString": true,
    "isStep": true,
    "metadata": { "property1": "string", "property2": "string" },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetExternalIds": ["my.known.id"],
    "rootAssetIds": [343099548723932, 88483999203217],
    "assetSubtreeIds": [{ "id": 1 }],
    "dataSetIds": [{ "id": 1 }],
    "externalIdPrefix": "my.known.prefix",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 }
  },
  "search": {
    "name": "string",
    "description": "string",
    "query": "some other"
  },
  "limit": 100
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": { "property1": "string", "property2": "string" },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Updates one or more time series. Fields outside of the request remain unchanged.
For primitive fields (those whose type is string, number, or boolean), use `"set": value` to update the value; use `"setNull": true` to set the field to null.
For JSON array fields (for example `securityCategories`), use `"set": [value1, value2]` to update the value, `"add": [value1, value2]` to add values, and `"remove": [value1, value2]` to remove values.
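A single update item can mix these patch verbs. The sketch below illustrates the shapes described above; the concrete values are illustrative, and the `add`/`remove` forms for `metadata` (add an object of keys, remove an array of key names) are an assumption based on the general patch pattern:

```typescript
// Sketch: one time series update item combining set, setNull, add, and remove.
// Values are illustrative; the metadata add/remove shape is an assumption.
const updateItem = {
  id: 1,
  update: {
    description: { set: "Discharge pressure" },      // primitive: replace value
    assetId: { setNull: true },                      // primitive: clear to null
    securityCategories: { add: [101], remove: [3] }, // array: add and remove values
    metadata: {
      add: { manufacturer: "acme" },                 // assumed: merge in keys
      remove: ["obsoleteKey"],                       // assumed: drop keys by name
    },
  },
};
```

Such items go in the `items` array of the update request body, as in the example below.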
List of changes.
required | Array of TimeSeriesUpdateById (object) or TimeSeriesUpdateByExternalId (object) (TimeSeriesUpdate) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "update": {
        "externalId": { "set": "string" },
        "name": { "set": "string" },
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "unit": { "set": "string" },
        "assetId": { "set": 0 },
        "isStep": { "set": true },
        "description": { "set": "string" },
        "securityCategories": { "set": [0] },
        "dataSetId": { "set": 0 }
      },
      "id": 1
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "externalId": "string",
      "name": "string",
      "isString": true,
      "metadata": { "property1": "string", "property2": "string" },
      "unit": "string",
      "assetId": 1,
      "isStep": true,
      "description": "string",
      "securityCategories": [0],
      "dataSetId": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Synthetic Time Series (STS) is a way to combine input time series, constants, and operators to create completely new time series.
For example, the expression `24 * TS{externalId='production/hour'}` converts hourly production rates to daily ones.
But STS is not limited to simple conversions; expressions such as `TS{id=123} + TS{externalId='hei'}`, `sin(pow(TS{id=123}, 2))`, and `TS{id=123, aggregate='average', granularity='1h'} + TS{id=456}` are also valid.
To learn more about synthetic time series, please follow our guide.
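When expressions are built programmatically, it can help to assemble the `TS{...}` references from structured parts rather than by string concatenation scattered through the code. A minimal sketch (the helper name and option shape are our own, and only the `TS{...}` syntax is taken from the examples above):

```typescript
// Sketch: assembling a synthetic time series expression string.
// tsRef is a hypothetical helper; the TS{...} syntax follows the examples above.
function tsRef(opts: {
  id?: number;
  externalId?: string;
  aggregate?: string;
  granularity?: string;
}): string {
  const parts: string[] = [];
  if (opts.id !== undefined) parts.push(`id=${opts.id}`);
  if (opts.externalId !== undefined) parts.push(`externalId='${opts.externalId}'`);
  if (opts.aggregate !== undefined) parts.push(`aggregate='${opts.aggregate}'`);
  if (opts.granularity !== undefined) parts.push(`granularity='${opts.granularity}'`);
  return `TS{${parts.join(", ")}}`;
}

// Hourly -> daily production rates, as in the example above.
const expression = `24 * ${tsRef({ externalId: "production/hour" })}`;
```

The resulting string goes into the `expression` field of a synthetic query item.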
Execute an on-the-fly synthetic query
The list of queries to perform
required | Array of objects (SyntheticQuery) [ 1 .. 10 ] items |
{
  "items": [
    {
      "expression": "(5 + TS{externalId='hello'}) / TS{id=123, aggregate='average', granularity='1h'}",
      "start": 0,
      "end": 0,
      "limit": 100
    }
  ]
}
{
  "items": [
    {
      "isString": false,
      "datapoints": [
        {
          "timestamp": 0,
          "value": 0
        }
      ]
    }
  ]
}
Event objects store complex information about multiple assets over a time period. For example, an event can describe two hours of maintenance on a water pump and some associated pipes, or a future time window where the pump is scheduled for inspection. This is in contrast with data points in time series that store single pieces of information about one asset at specific points in time (e.g., temperature measurements).
An event’s time period is defined by a start time and end time, both millisecond timestamps since the UNIX epoch. The timestamps can be in the future. In addition, events can have a text description as well as arbitrary metadata and properties.
Asset references obtained from an event (through asset IDs) may be invalid, due to the non-transactional nature of HTTP. They are maintained in an eventually consistent manner.
Rate and concurrency limits apply to some of the endpoints. If a request exceeds one of the limits,
it will be throttled with a 429: Too Many Requests
response. More on limit types
and how to avoid being throttled is described
here.
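The recommended handling of 429 responses is a retry loop with truncated exponential backoff. A minimal sketch, where `doRequest` is a hypothetical stand-in for the actual HTTP call and the delay constants are illustrative:

```typescript
// Sketch: truncated exponential backoff for 429 (Too Many Requests).
// `doRequest` is a hypothetical helper returning an HTTP status and body.
async function withBackoff<T>(
  doRequest: () => Promise<{ status: number; body?: T }>,
  maxRetries = 5,
  baseDelayMs = 250,
  maxDelayMs = 10_000,
): Promise<T | undefined> {
  for (let attempt = 0; ; attempt++) {
    const res = await doRequest();
    if (res.status !== 429) return res.body;
    if (attempt >= maxRetries) throw new Error("throttled: retries exhausted");
    // Exponential growth, truncated at maxDelayMs, with jitter to avoid
    // synchronized retries from parallel workers.
    const delay =
      Math.min(maxDelayMs, baseDelayMs * 2 ** attempt) * (0.5 + Math.random() / 2);
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
}
```

Combined with a modest partition count, this slows the request rate down to what the quota allows instead of failing outright.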
The following limits apply to the List events, Filter events, Aggregate events, and Search events endpoints. These limits apply to all the endpoints simultaneously; requests made to different endpoints are counted together. Note the additional conditions that apply to the Aggregate events endpoint, as it provides the most resource-consuming operations.
Limit | Per project | Per user (identity)
---|---|---
Rate | 30 rps total, of which no more than 15 rps to Aggregate | 20 rps, of which no more than 10 rps to Aggregate
Concurrency | 15 parallel requests, of which no more than 6 to Aggregate | 10 parallel requests, of which no more than 4 to Aggregate
The aggregation API lets you compute aggregated results on events, such as getting the count of all Events in a project, checking different descriptions of events in your project, etc.
Filters behave the same way as for the Filter events endpoint. In text properties, the values are aggregated in a case-insensitive manner.
`aggregateFilter` works similarly to `advancedFilter` but always applies to aggregate properties. For instance, in an aggregation for the `source` property, only the values (aka buckets) of the `source` property can be filtered out.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval usage. It is subject to the new throttling schema (limited request rate and concurrency). Please check the Events resource description for more information.
aggregate | string Value: "count" Type of aggregation to apply. |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (EventAdvancedFilter) A filter DSL (Domain Specific Language) to define advanced filter queries. See more information about filtering DSL here. Supported properties: Note: Filtering on the |
object (EventFilter) Filter on events with exact match |
{
  "aggregate": "count",
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            { "equals": { "property": ["metadata", "severity"], "value": "medium" } },
            { "in": { "property": ["source"], "values": ["inspection protocol", "incident report"] } },
            { "range": { "property": ["dataSetId"], "gte": 1, "lt": 10 } }
          ]
        }
      },
      {
        "and": [
          { "equals": { "property": ["type"], "value": "equipment malfunction" } },
          { "equals": { "property": ["subtype"], "value": "mechanical failure" } }
        ]
      },
      {
        "search": { "property": ["description"], "value": "outage" }
      }
    ]
  },
  "filter": {
    "startTime": { "max": 0, "min": 0 },
    "endTime": { "max": 0, "min": 0 },
    "activeAtTime": { "max": 0, "min": 0 },
    "metadata": { "property1": "string", "property2": "string" },
    "assetIds": [1],
    "assetExternalIds": ["my.known.id"],
    "assetSubtreeIds": [{ "id": 1 }],
    "dataSetIds": [{ "id": 1 }],
    "source": "string",
    "type": "string",
    "subtype": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix"
  }
}
{
  "items": [
    { "count": 10 }
  ]
}
Creates multiple event objects in the same project. It is possible to post a maximum of 1000 events per request.
List of events to be posted. It is possible to post a maximum of 1000 events per request.
required | Array of objects (ExternalEvent) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": { "property1": "string", "property2": "string" },
      "assetIds": [1],
      "source": "string"
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": { "property1": "string", "property2": "string" },
      "assetIds": [1],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ]
}
Deletes events with the given IDs. A maximum of 1000 events can be deleted per request.
List of IDs to delete.
required | Array of InternalId (object) or ExternalId (object) (EitherId) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{ }
Retrieve a list of events in the same project. This operation supports pagination by cursor. Apply Filtering and Advanced filtering criteria to select a subset of events.
Advanced filter lets you create complex filtering expressions that combine simple operations, such as `equals`, `prefix`, and `exists`, using the Boolean operators `and`, `or`, and `not`.
It applies to basic fields as well as metadata. See the `advancedFilter` attribute in the example.
See more information about the filtering DSL here.
Leaf filter | Supported fields | Description
---|---|---
containsAll | Array type fields | Only includes results which contain all of the specified values. {"containsAll": {"property": ["property"], "values": [1, 2, 3]}}
containsAny | Array type fields | Only includes results which contain at least one of the specified values. {"containsAny": {"property": ["property"], "values": [1, 2, 3]}}
equals | Non-array type fields | Only includes results that are equal to the specified value. {"equals": {"property": ["property"], "value": "example"}}
exists | All fields | Only includes results where the specified property exists (has a value). {"exists": {"property": ["property"]}}
in | Non-array type fields | Only includes results that are equal to one of the specified values. {"in": {"property": ["property"], "values": [1, 2, 3]}}
prefix | String type fields | Only includes results which start with the specified value. {"prefix": {"property": ["property"], "value": "example"}}
range | Non-array type fields | Only includes results that fall within the specified range. {"range": {"property": ["property"], "gt": 1, "lte": 5}} Supported operators: gt, lt, gte, lte
search | ["description"] | Introduced to provide functional parity with the /events/search endpoint. {"search": {"property": ["property"], "value": "example"}}
The `search` leaf filter provides functional parity with the /events/search endpoint. It's available only for the `["description"]` field. When specifying only this filter with no explicit ordering, the behavior is the same as that of the /events/search endpoint without filters. Explicit sorting overrides the default ordering by relevance.
You can use the `search` leaf filter like any other leaf filter to create complex queries; see the `search` filter in the `advancedFilter` attribute in the example.
The filter query has the following limitations:
- `and` and `or` clauses must have at least one element.
- The `property` array of each leaf filter must match one of the existing properties.
- The `containsAll`, `containsAny`, and `in` filter `values` array size must be in the range [1, 100].
- The `containsAll`, `containsAny`, and `in` filter `values` array must contain elements of a primitive type (number, string).
- The `range` filter must have at least one of the `gt`, `gte`, `lt`, `lte` attributes. `gt` is mutually exclusive with `gte`, while `lt` is mutually exclusive with `lte`. For metadata, both upper and lower bounds must be specified.
- `gt`, `gte`, `lt`, `lte` in the `range` filter must be a primitive value.
- The `search` filter `value` must not be blank, and its length must be in the range [1, 128].

Maximum value lengths by property:
- externalId - 255
- description - 128 for the `search` filter and 255 for other filters
- type - 64
- subtype - 64
- source - 128
- metadata key - 128

By default, events are sorted by their creation time in ascending order.
Use the `search` leaf filter to sort the results by relevance. Sorting by other fields can be explicitly requested. The `order` field is optional and defaults to `desc` for `_score_` and `asc` for all other fields. The `nulls` field is optional and defaults to `auto`; the service translates `auto` to `last` for the `asc` order and to `first` for the `desc` order.
Partitions are done independently of sorting: there's no guarantee of the sort order between elements from different partitions.
See the `sort` attribute in the example.
If the `nulls` attribute has the `auto` value, or the attribute isn't specified, null (missing) values are considered to be bigger than any other values. They are placed last when sorting in the `asc` order and first when sorting in `desc`. Otherwise, missing values are placed according to the `nulls` attribute (`last` or `first`), and their placement doesn't depend on the `order` value. Values such as empty strings aren't considered nulls.
Use the special sort property `_score_` when sorting by relevance. The more filters a particular event matches, the higher its score. This can be useful, for example, when building UIs. Assume we want exact matches to be displayed above matches by prefix, as in the request below. An event with the type fire will match both the `equals` and `prefix` filters and will therefore have a higher score than events with types like fire training that match only the `prefix` filter.
"advancedFilter" : {
"or" : [
{
"equals": {
"property": ["type"],
"value": "fire"
}
},
{
"prefix": {
"property": ["type"],
"value": "fire"
}
}
]
},
"sort": [
{
"property" : ["_score_"]
}
]
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval usage. It is subject to the new throttling schema (limited request rate and concurrency). Please check the Events resource description for more information.
object (EventFilter) Filter on events with exact match |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (EventAdvancedFilter) A filter DSL (Domain Specific Language) to define advanced filter queries. See more information about filtering DSL here. Supported properties: Note: Filtering on the |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the maximum number of results to be returned by a single request. In case there are more results to the request, the 'nextCursor' attribute will be provided as part of the response. Request may contain less results than the request limit. | ||||||||||||||||||||||||||||||
Array of modern (objects) or Array of deprecated (strings) | |||||||||||||||||||||||||||||||
cursor | string | ||||||||||||||||||||||||||||||
partition | string (Partition) Splits the data set into N partitions. The format is m/n, where m is a natural number between 1 and n, identifying the partition to read. To receive all the data, you must pass the same partition value to every subquery and follow the cursors within each partition. To prevent unexpected problems and maximize read throughput, you should at most use 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10 if you ask for more. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status.
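The parallel-retrieval scheme above can be sketched as follows. Building the `m/n` partition strings is defined by the API; `listEvents` is a hypothetical wrapper around POST /events/list, injected so the pagination-per-partition loop can be shown without a live project:

```typescript
// Build the partition values "1/n" ... "n/n" (m running from 1 to n).
function partitionValues(n: number): string[] {
  return Array.from({ length: n }, (_, i) => `${i + 1}/${n}`);
}

// Download the whole data set by reading all partitions in parallel.
// Within one partition, the same partition value must be passed to every
// subquery while following nextCursor.
async function downloadAllEvents(
  listEvents: (
    partition: string,
    cursor?: string
  ) => Promise<{ items: unknown[]; nextCursor?: string }>,
  partitions = 10 // keep at most 10, per the guidance above
): Promise<unknown[]> {
  const perPartition = partitionValues(partitions).map(async (partition) => {
    const items: unknown[] = [];
    let cursor: string | undefined;
    do {
      const page = await listEvents(partition, cursor);
      items.push(...page.items);
      cursor = page.nextCursor;
    } while (cursor);
    return items;
  });
  return (await Promise.all(perPartition)).flat();
}
```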
{
  "filter": {
    "startTime": { "max": 0, "min": 0 },
    "endTime": { "max": 0, "min": 0 },
    "activeAtTime": { "max": 0, "min": 0 },
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [ 1 ],
    "assetExternalIds": [ "my.known.id" ],
    "assetSubtreeIds": [ { "id": 1 } ],
    "dataSetIds": [ { "id": 1 } ],
    "source": "string",
    "type": "string",
    "subtype": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix"
  },
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            { "equals": { "property": [ "metadata", "severity" ], "value": "medium" } },
            { "in": { "property": [ "source" ], "values": [ "inspection protocol", "incident report" ] } },
            { "range": { "property": [ "dataSetId" ], "gte": 1, "lt": 10 } }
          ]
        }
      },
      {
        "and": [
          { "equals": { "property": [ "type" ], "value": "equipment malfunction" } },
          { "equals": { "property": [ "subtype" ], "value": "mechanical failure" } }
        ]
      },
      { "search": { "property": [ "description" ], "value": "outage" } }
    ]
  },
  "limit": 100,
  "sort": [
    { "property": [ "createdTime" ], "order": "desc" },
    { "property": [ "metadata", "FooBar" ], "nulls": "first" }
  ],
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "partition": "1/10"
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ],
  "nextCursor": "string"
}
List events, optionally filtered on query parameters.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval. It is subject to the new throttling schema (limited request rate and concurrency). Please check the Events resource description for more information.
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
minStartTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxStartTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minEndTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxEndTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minActiveAtTime | integer <int64> >= 0 An event is considered active from its startTime to endTime, inclusive. If startTime is null, the event is never active. If endTime is null, the event is active from startTime onwards. The activeAtTime filter matches all events that are active at some point from min to max, from min, or to max, depending on which of the min and max parameters are specified. |
maxActiveAtTime | integer <int64> >= 0 An event is considered active from its startTime to endTime, inclusive. If startTime is null, the event is never active. If endTime is null, the event is active from startTime onwards. The activeAtTime filter matches all events that are active at some point from min to max, from min, or to max, depending on which of the min and max parameters are specified. |
assetIds | string <jsonArray(int64)> (JsonArrayInt64) Example: assetIds=[363848954441724, 793045462540095, 1261042166839739] Asset IDs of equipment that this event relates to. Format is list of IDs serialized as JSON array(int64). Takes [ 1 .. 100 ] of unique items. |
assetExternalIds | string <jsonArray(string)> (JsonArrayString) Example: assetExternalIds=["externalId1", "externalId2", "externalId3"] Asset external IDs of equipment that this event relates to. Takes 1..100 unique items. |
assetSubtreeIds | string <jsonArray(int64)> (JsonArrayInt64) Example: assetSubtreeIds=[363848954441724, 793045462540095, 1261042166839739] Only include events that have a related asset in a subtree rooted at any of these assetIds (including the roots given). If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
assetSubtreeExternalIds | string <jsonArray(string)> (JsonArrayString) Example: assetSubtreeExternalIds=["externalId1", "externalId2", "externalId3"] Only include events that have a related asset in a subtree rooted at any of these assetExternalIds (including the roots given). If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
source | string <= 128 characters The source of this event. |
type | string (EventType) <= 64 characters Type of the event, e.g 'failure'. |
subtype | string (EventSubType) <= 64 characters SubType of the event, e.g 'electrical'. |
minCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
externalIdPrefix | string (CogniteExternalIdPrefix) <= 255 characters Example: externalIdPrefix=my.known.prefix Filter by this (case-sensitive) prefix for the external ID. |
partition | string Example: partition=1/10 Splits the data set into N partitions. The format is m/n, where m is a natural number between 1 and n, identifying the partition to read. To receive all the data, you must pass the same partition value to every subquery and follow the cursors within each partition. To prevent unexpected problems and maximize read throughput, you should at most use 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10 if you ask for more. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
includeMetadata | boolean Default: true Whether the metadata field should be returned or not. |
sort | Array of strings Example: sort=endTime:desc Sort by an array of the selected fields. Syntax: |
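A small helper can build these sort values from typed pairs, following the `field:direction` form shown in the example (`sort=endTime:desc`). The helper names here are illustrative, not part of the API:

```typescript
type SortOrder = "asc" | "desc";

// Build sort values in the field:direction form shown in the example.
function sortParams(fields: Array<[string, SortOrder]>): string[] {
  return fields.map(([field, order]) => `${field}:${order}`);
}

// Serialize into a query string with one `sort` entry per field.
function toQuery(sort: string[]): string {
  return sort.map((s) => `sort=${encodeURIComponent(s)}`).join("&");
}
```

For example, `sortParams([["endTime", "desc"]])` yields `["endTime:desc"]`, which serializes to `sort=endTime%3Adesc`.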
const events = await client.events.list({ filter: { startTime: { min: new Date('1 jan 2018') }, endTime: { max: new Date('1 jan 2019') } } });
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ],
  "nextCursor": "string"
}
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
const events = await client.events.retrieve([{id: 123}, {externalId: 'abc'}]);
{
  "externalId": "my.known.id",
  "dataSetId": 1,
  "startTime": 0,
  "endTime": 0,
  "type": "string",
  "subtype": "string",
  "description": "string",
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "assetIds": [ 1 ],
  "source": "string",
  "id": 1,
  "lastUpdatedTime": 0,
  "createdTime": 0
}
Retrieves information about events in the same project. Events are returned in the same order as the ids listed in the query.
A maximum of 1000 event IDs may be listed per request and all of them must be unique.
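A client retrieving more than 1000 events must therefore deduplicate and chunk the ID list before calling the endpoint. A sketch — `retrieveEvents` is a hypothetical wrapper around POST /events/byids; the chunking helper itself is plain TypeScript:

```typescript
// Deduplicate, then split into batches of at most `size` IDs.
function chunkUnique(ids: number[], size = 1000): number[][] {
  const unique = [...new Set(ids)]; // the API requires all IDs to be unique
  const chunks: number[][] = [];
  for (let i = 0; i < unique.length; i += size) {
    chunks.push(unique.slice(i, i + size));
  }
  return chunks;
}

// Retrieve an arbitrary number of events, one compliant request per chunk.
async function retrieveAllEvents(
  retrieveEvents: (ids: number[]) => Promise<unknown[]>,
  ids: number[]
): Promise<unknown[]> {
  const results: unknown[] = [];
  for (const chunk of chunkUnique(ids)) {
    results.push(...(await retrieveEvents(chunk)));
  }
  return results;
}
```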
List of IDs of events to retrieve. Must be up to a maximum of 1000 IDs, and all of them must be unique.
required | Array of InternalId (object) or ExternalId (object) (EitherId) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ]
}
Fulltext search for events based on result relevance. Primarily meant for human-centric use-cases, not for programs, since matching and ordering may change over time. Additional filters can also be specified. This operation doesn't support pagination.
This endpoint is meant for data analytics/exploration usage and is not suitable for high-load data retrieval. It is subject to the new throttling schema (limited request rate and concurrency). Please check the Events resource description for more information.
object (EventFilter) Filter on events with exact match |
object (EventSearch) |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the maximum number of results returned by a single request. A request may return fewer results than the request limit. |
{
  "filter": {
    "startTime": { "max": 0, "min": 0 },
    "endTime": { "max": 0, "min": 0 },
    "activeAtTime": { "max": 0, "min": 0 },
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [ 1 ],
    "assetExternalIds": [ "my.known.id" ],
    "assetSubtreeIds": [ { "id": 1 } ],
    "dataSetIds": [ { "id": 1 } ],
    "source": "string",
    "type": "string",
    "subtype": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix"
  },
  "search": {
    "description": "string"
  },
  "limit": 100
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ]
}
Updates events in the same project. This operation supports partial updates; fields omitted from the request will remain unchanged on the objects.
For primitive fields (String, Long, Int), use 'set': 'value' to update the value; use 'setNull': true to set the field to null.
For JSON array fields (e.g. assetIds), use 'set': [value1, value2] to replace the value; use 'add': [v1, v2] to add values to the current list; use 'remove': [v1, v2] to remove these values from the current list if they exist.
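The update rules above can be sketched as a small body builder. The field values here (description "pump inspection", asset IDs 42 and 7) are made-up illustrations; the 'set' / 'setNull' / 'add' / 'remove' shapes follow the rules just described:

```typescript
// Assemble one partial-update item for the events update endpoint.
function buildEventUpdate(id: number) {
  return {
    id,
    update: {
      description: { set: "pump inspection" }, // primitive: replace the value
      subtype: { setNull: true },              // primitive: set the field to null
      assetIds: { add: [42], remove: [7] },    // JSON array: edit the current list
    },
  };
}

// The request body wraps up to 1000 such items, with unique event IDs.
const body = { items: [buildEventUpdate(123)] };
```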
A maximum of 1000 events can be updated per request, and all of the event IDs must be unique.
List of changes. A maximum of 1000 events can be updated per request, and all of the event IDs must be unique.
required | Array of EventChangeById (object) or EventChangeByExternalId (object) (EventChange) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "update": {
        "externalId": { "set": "my.known.id" },
        "dataSetId": { "set": 0 },
        "startTime": { "set": 0 },
        "endTime": { "set": 0 },
        "description": { "set": "string" },
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "assetIds": { "set": [ 0 ] },
        "source": { "set": "string" },
        "type": { "set": "string" },
        "subtype": { "set": "string" }
      },
      "id": 1
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "dataSetId": 1,
      "startTime": 0,
      "endTime": 0,
      "type": "string",
      "subtype": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "source": "string",
      "id": 1,
      "lastUpdatedTime": 0,
      "createdTime": 0
    }
  ]
}
A file stores a sequence of bytes connected to one or more assets. For example, a file can contain a piping and instrumentation diagram (P&ID) showing how multiple assets are connected.
Each file is identified by the 'id' field, which is generated internally for each new file. Each file's 'id' field is unique within a project.
The 'externalId' field is optional, but can also be used to identify a file. The 'externalId' (if used) must be unique within a project.
Files are created in two steps: first the metadata is stored in a file object, and then the file contents are uploaded. This means that files can exist in a non-uploaded state. The upload state is reflected in the 'uploaded' field in responses.
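The two-step flow can be sketched as follows. `Http`, `createAndUpload`, and the "/files" path shape are illustrative assumptions; only the create-then-PUT sequencing and the upload headers mirror the documented flow:

```typescript
// A hypothetical minimal HTTP client, injected so the sequencing can be
// shown without a live CDF project.
interface Http {
  post(url: string, body: unknown): Promise<{ uploadUrl: string; id: number }>;
  put(url: string, data: Uint8Array, headers: Record<string, string>): Promise<void>;
}

// Headers required by the upload PUT request.
function uploadHeaders(mimeType: string, length: number): Record<string, string> {
  return { "Content-Type": mimeType, "Content-Length": String(length) };
}

async function createAndUpload(
  http: Http,
  name: string,
  data: Uint8Array,
  mimeType = "application/pdf"
): Promise<number> {
  // Step 1: store the metadata; the file now exists in a non-uploaded state
  // ("uploaded": false) until the contents arrive.
  const { uploadUrl, id } = await http.post("/files", { name, mimeType });
  // Step 2: upload the contents to the returned uploadUrl.
  await http.put(uploadUrl, data, uploadHeaders(mimeType, data.length));
  return id;
}
```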
Asset references obtained from a file - through asset IDs - may be invalid due to the non-transactional nature of HTTP. They are maintained in an eventually consistent manner.
Calculate aggregates for files, based on an optional filter specification. Returns the following aggregates: count.
Files aggregate request body
object |
{
  "filter": {
    "name": "string",
    "directoryPrefix": "/my/known/directory",
    "mimeType": "image/jpeg",
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [ 363848954441724, 793045462540095, 1261042166839739 ],
    "assetExternalIds": [ "externalId1", "externalId2", "externalId3" ],
    "rootAssetIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "dataSetIds": [ { "id": 1 } ],
    "assetSubtreeIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "source": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "uploadedTime": { "max": 0, "min": 0 },
    "sourceCreatedTime": { "max": 0, "min": 0 },
    "sourceModifiedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix",
    "uploaded": true,
    "labels": {
      "containsAny": [ { "externalId": "my.known.id" } ]
    },
    "geoLocation": {
      "relation": "INTERSECTS",
      "shape": {
        "type": "Point",
        "coordinates": [ 0, 0 ]
      }
    }
  }
}
{
  "items": [
    { "count": 0 }
  ]
}
Deletes the files with the given ids.
A maximum of 1000 files can be deleted per request.
List of IDs of files to delete.
Array of FileInternalId (object) or FileExternalId (object) (FileIdEither) [ 1 .. 1000 ] items |
{
  "items": [
    { "id": 1 }
  ]
}
{ }
Retrieves a list of download URLs for the specified list of file IDs. After getting the download links, the client has to issue a GET request to the returned URLs, which will respond with the contents of the file. By default, the links expire after 30 seconds. If the extendedExpiration query parameter is provided, the links expire after 1 hour.
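The two-step download flow can be sketched as follows. `http`, `downloadFiles`, and the "/files/downloadlink" path shape are illustrative assumptions; the 100-item limit and the link lifetime come from the endpoint description:

```typescript
type FileRef = { id: number } | { externalId: string };

// Build the request body; the endpoint accepts at most 100 file IDs per request.
function downloadLinkBody(refs: FileRef[]): { items: FileRef[] } {
  if (refs.length > 100) throw new Error("at most 100 file IDs per request");
  return { items: refs };
}

async function downloadFiles(
  http: {
    post(url: string, body: unknown): Promise<{ items: Array<{ downloadUrl: string }> }>;
    get(url: string): Promise<Uint8Array>;
  },
  refs: FileRef[]
): Promise<Uint8Array[]> {
  // Append ?extendedExpiration=true instead if the GETs may take longer
  // than the default ~30-second link lifetime.
  const { items } = await http.post("/files/downloadlink", downloadLinkBody(refs));
  return Promise.all(items.map((link) => http.get(link.downloadUrl)));
}
```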
extendedExpiration | boolean Default: false If set to true, extends the expiration period of the link to 1 hour. |
List of file IDs to retrieve the download URL for.
Array of FileInternalId (object) or FileExternalId (object) (FileIdEither) [ 1 .. 100 ] items |
{
  "items": [
    { "id": 1 }
  ]
}
{
  "items": [
    {
      "downloadUrl": "string",
      "id": 1
    }
  ]
}
Retrieves a list of all files in a project. Criteria can be supplied to select a subset of files. This operation supports pagination with cursors.
The project name
object | |
partition | string (Partition) Splits the data set into N partitions. The format is m/n, where m is a natural number between 1 and n, identifying the partition to read. To receive all the data, you must pass the same partition value to every subquery and follow the cursors within each partition. To prevent unexpected problems and maximize read throughput, you should at most use 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10 if you ask for more. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Maximum number of items the client wants returned. |
cursor | string |
{
  "filter": {
    "name": "string",
    "directoryPrefix": "/my/known/directory",
    "mimeType": "image/jpeg",
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [ 363848954441724, 793045462540095, 1261042166839739 ],
    "assetExternalIds": [ "externalId1", "externalId2", "externalId3" ],
    "rootAssetIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "dataSetIds": [ { "id": 1 } ],
    "assetSubtreeIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "source": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "uploadedTime": { "max": 0, "min": 0 },
    "sourceCreatedTime": { "max": 0, "min": 0 },
    "sourceModifiedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix",
    "uploaded": true,
    "labels": {
      "containsAny": [ { "externalId": "my.known.id" } ]
    },
    "geoLocation": {
      "relation": "INTERSECTS",
      "shape": {
        "type": "Point",
        "coordinates": [ 0, 0 ]
      }
    }
  },
  "partition": "1/10",
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo"
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "directory": "string",
      "source": "string",
      "mimeType": "image/jpeg",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "dataSetId": 1,
      "sourceCreatedTime": 0,
      "sourceModifiedTime": 0,
      "securityCategories": [ 1 ],
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [ 0, 0 ]
        },
        "properties": { }
      },
      "id": 1,
      "uploaded": true,
      "uploadedTime": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
The GET /files/icon operation can be used to get an image representation of a file.
Either id or externalId must be provided as a query parameter (but not both). Supported file formats:
id | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
externalId | string (CogniteExternalId) <= 255 characters Example: externalId=my.known.id The external ID provided by the client. Must be unique for the resource type. |
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [
      { }
    ],
    "duplicated": [
      { }
    ]
  }
}
The GET /files operation can be used to return information for all files in a project.
Optionally you can add one or more of the following query parameters. The filter query parameters will filter the results to only include files that match all filter parameters.
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
name | string (FileName) <= 256 characters Name of the file. |
mimeType | string (MimeType) <= 256 characters Example: mimeType=image/jpeg File type. E.g. text/plain, application/pdf, .. |
source | string (FileSource) <= 128 characters The source of the file. |
assetIds | Array of integers <int64> (AssetIds) [ 1 .. 100 ] items unique [ items <int64 > [ 1 .. 9007199254740991 ] ] Example: assetIds=363848954441724&assetIds=793045462540095&assetIds=1261042166839739 Only include files that reference these specific asset IDs. |
assetExternalIds | string <jsonArray(string)> (JsonArrayString) Example: assetExternalIds=["externalId1", "externalId2", "externalId3"] Asset external IDs of related equipment that this file relates to. Takes 1..100 unique items. |
Array of DataSetInternalId (object) or DataSetExternalId (object) (DataSetIdEithers) | |
rootAssetIds | string <jsonArray(int64)> (JsonArrayInt64) Example: rootAssetIds=[363848954441724, 793045462540095, 1261042166839739] Only include files that have a related asset in a tree rooted at any of these root assetIds. |
assetSubtreeIds | string <jsonArray(int64)> (JsonArrayInt64) Example: assetSubtreeIds=[363848954441724, 793045462540095, 1261042166839739] Only include files that have a related asset in a subtree rooted at any of these assetIds (including the roots given). If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
assetSubtreeExternalIds | string <jsonArray(string)> (JsonArrayString) Example: assetSubtreeExternalIds=["externalId1", "externalId2", "externalId3"] Only include files that have a related asset in a subtree rooted at any of these assetExternalIds (including the roots given). If the total size of the given subtrees exceeds 100,000 assets, an error will be returned. |
minCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxCreatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minUploadedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxUploadedTime | integer <int64> (EpochTimestamp) >= 0 The number of milliseconds since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
minSourceCreatedTime | integer <int64> (EpochTimestamp) >= 0 Include files that have sourceCreatedTime set, with this value as the minimum. |
maxSourceCreatedTime | integer <int64> (EpochTimestamp) >= 0 Include files that have sourceCreatedTime set, with this value as the maximum. |
minSourceModifiedTime | integer <int64> (EpochTimestamp) >= 0 Include files that have sourceModifiedTime set, with this value as the minimum. |
maxSourceModifiedTime | integer <int64> (EpochTimestamp) >= 0 Include files that have sourceModifiedTime set, with this value as the maximum. |
externalIdPrefix | string (CogniteExternalIdPrefix) <= 255 characters Example: externalIdPrefix=my.known.prefix Filter by this (case-sensitive) prefix for the external ID. |
uploaded | boolean Example: uploaded=true Whether or not the actual file is uploaded. This field is returned only by the API; it has no effect in a POST body. |
partition | string Example: partition=1/10 Splits the data set into N partitions. The format is m/n, where m is a natural number between 1 and n, identifying the partition to read. To receive all the data, you must pass the same partition value to every subquery and follow the cursors within each partition. To prevent unexpected problems and maximize read throughput, you should at most use 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10 if you ask for more. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
const files = await client.files.list({filter: {mimeType: 'image/png'}});
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "directory": "string",
      "source": "string",
      "mimeType": "image/jpeg",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "dataSetId": 1,
      "sourceCreatedTime": 0,
      "sourceModifiedTime": 0,
      "securityCategories": [ 1 ],
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [ 0, 0 ]
        },
        "properties": { }
      },
      "id": 1,
      "uploaded": true,
      "uploadedTime": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Returns file info for the file ID
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
const files = await client.files.retrieve([{id: 123}, {externalId: 'abc'}]);
{
  "externalId": "my.known.id",
  "name": "string",
  "directory": "string",
  "source": "string",
  "mimeType": "image/jpeg",
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "assetIds": [ 1 ],
  "dataSetId": 1,
  "sourceCreatedTime": 0,
  "sourceModifiedTime": 0,
  "securityCategories": [ 1 ],
  "labels": [
    { "externalId": "my.known.id" }
  ],
  "geoLocation": {
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [ 0, 0 ]
    },
    "properties": { }
  },
  "id": 1,
  "uploaded": true,
  "uploadedTime": 0,
  "createdTime": 0,
  "lastUpdatedTime": 0
}
Retrieves metadata information about multiple specific files in the same project. Results are returned in the same order as in the request. This operation does not return the file contents.
List of IDs of files to retrieve. Must be up to a maximum of 1000 IDs, and all of them must be unique.
required | Array of Select by Id (object) or Select by ExternalId (object) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "directory": "string",
      "source": "string",
      "mimeType": "image/jpeg",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "dataSetId": 1,
      "sourceCreatedTime": 0,
      "sourceModifiedTime": 0,
      "securityCategories": [ 1 ],
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [ 0, 0 ]
        },
        "properties": { }
      },
      "id": 1,
      "uploaded": true,
      "uploadedTime": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Search for files based on relevance. You can also supply a strict match filter as in Filter files, and search in the results from the filter. Returns first 1000 results based on relevance. This operation does not support pagination.
object | |
object |
{
  "filter": {
    "name": "string",
    "directoryPrefix": "/my/known/directory",
    "mimeType": "image/jpeg",
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "assetIds": [ 363848954441724, 793045462540095, 1261042166839739 ],
    "assetExternalIds": [ "externalId1", "externalId2", "externalId3" ],
    "rootAssetIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "dataSetIds": [ { "id": 1 } ],
    "assetSubtreeIds": [
      { "id": 123456789 },
      { "externalId": "system 99 external Id 1234" }
    ],
    "source": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "uploadedTime": { "max": 0, "min": 0 },
    "sourceCreatedTime": { "max": 0, "min": 0 },
    "sourceModifiedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix",
    "uploaded": true,
    "labels": {
      "containsAny": [ { "externalId": "my.known.id" } ]
    },
    "geoLocation": {
      "relation": "INTERSECTS",
      "shape": {
        "type": "Point",
        "coordinates": [ 0, 0 ]
      }
    }
  },
  "search": {
    "name": "string"
  }
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "directory": "string",
      "source": "string",
      "mimeType": "image/jpeg",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "dataSetId": 1,
      "sourceCreatedTime": 0,
      "sourceModifiedTime": 0,
      "securityCategories": [ 1 ],
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [ 0, 0 ]
        },
        "properties": { }
      },
      "id": 1,
      "uploaded": true,
      "uploadedTime": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Updates the information for the files specified in the request body.
If you want to update the file content previously uploaded using the uploadUrl, use the initFileUpload request with the query parameter 'overwrite=true'. Alternatively, delete and recreate the file.
For primitive fields (String, Long, Int), use 'set': 'value' to update the value; use 'setNull': true to set the field to null.
For the Json Array field (e.g. assetIds and securityCategories): Use either only 'set', or a combination of 'add' and/or 'remove'.
AssetIds update examples:
Example request body to overwrite assetIds with a new set, asset ID 1 and 2.
{
"items": [
{
"id": 1,
"update": {
"assetIds" : {
"set" : [ 1, 2 ]
}
}
}
]
}
Example request body to add one asset ID and remove another asset ID.
{
"items": [
{
"id": 1,
"update": {
"assetIds" : {
"add" : [ 3 ],
"remove": [ 2 ]
}
}
}
]
}
Metadata update examples:
Example request body to overwrite metadata with a new set.
{
"items": [
{
"id": 1,
"update": {
"metadata": {
"set": {
"key1": "value1",
"key2": "value2"
}
}
}
}
]
}
Example request body to add two key-value pairs and remove two other key-value pairs by key for the metadata field.
{
"items": [
{
"id": 1,
"update": {
"metadata": {
"add": {
"key3": "value3",
"key4": "value4"
},
"remove": [
"key1",
"key2"
]
}
}
}
]
}
The JSON request body which specifies which files and fields to update.
required | Array of FileChangeUpdateById (object) or FileChangeUpdateByExternalId (object) (FileChangeUpdate) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "id": 1,
      "update": {
        "externalId": { "set": "string" },
        "directory": { "set": "string" },
        "source": { "set": "string" },
        "mimeType": { "set": "string" },
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "assetIds": { "set": [ 0 ] },
        "sourceCreatedTime": { "set": 0 },
        "sourceModifiedTime": { "set": 0 },
        "dataSetId": { "set": 0 },
        "securityCategories": { "set": [ 0 ] },
        "labels": {
          "add": [ { "externalId": "my.known.id" } ],
          "remove": [ { "externalId": "my.known.id" } ]
        },
        "geoLocation": {
          "set": {
            "type": "Feature",
            "geometry": {
              "type": "Point",
              "coordinates": [ 0, 0 ]
            },
            "properties": { }
          }
        }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "directory": "string",
      "source": "string",
      "mimeType": "image/jpeg",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "assetIds": [ 1 ],
      "dataSetId": 1,
      "sourceCreatedTime": 0,
      "sourceModifiedTime": 0,
      "securityCategories": [ 1 ],
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "geoLocation": {
        "type": "Feature",
        "geometry": {
          "type": "Point",
          "coordinates": [ 0, 0 ]
        },
        "properties": { }
      },
      "id": 1,
      "uploaded": true,
      "uploadedTime": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Create metadata information and get an upload link for a file.
To upload the file, issue a separate request: send an HTTP PUT request to the uploadUrl in the response, with the relevant 'Content-Type' and 'Content-Length' headers.
If the uploadUrl contains the string '/v1/files/gcs_proxy/', you can make a Google Cloud Storage (GCS) resumable upload request as documented in https://cloud.google.com/storage/docs/json_api/v1/how-tos/resumable-upload.
The uploadUrl expires after one week. Any file info entry that does not have the actual file uploaded within one week will be automatically deleted.
overwrite | boolean Default: false If 'overwrite' is set to true, and the POST body specifies an 'externalId' field, the fields of the file found by that externalId can be overwritten. The default setting is false. If metadata is included in the request body, all of the original metadata will be overwritten. The actual file will be overwritten after a successful upload with the uploadUrl from the response. If there is no successful upload, the current file contents will be kept. File-asset mappings only change if explicitly stated in the assetIds field of the POST JSON body. Do not set assetIds in the request body if you want to keep the current file-asset mappings. |
Origin | string The 'Origin' header parameter is required if there is a Cross Origin issue. |
Fields to be set for the file.
externalId | string (CogniteExternalId) <= 255 characters The external ID provided by the client. Must be unique for the resource type. |
name required | string (FileName) <= 256 characters Name of the file. |
directory | string (FileDirectory) <= 512 characters Directory containing the file. Must be an absolute, unix-style path. |
source | string (FileSource) <= 128 characters The source of the file. |
mimeType | string (MimeType) <= 256 characters File type, e.g. text/plain or application/pdf. |
metadata | object (FilesMetadataField) Custom, application specific metadata. String key -> String value. Limits: Maximum length of key is 128 bytes, value 10240 bytes, up to 256 key-value pairs, of total size at most 10240. |
assetIds | Array of integers <int64> (CogniteInternalId) [ 1 .. 1000 ] items [ items <int64 > [ 1 .. 9007199254740991 ] ] |
dataSetId | integer <int64> (DataSetId) [ 1 .. 9007199254740991 ] The dataSet Id for the item. |
sourceCreatedTime | integer <int64> >= 0 The timestamp for when the file was originally created in the source system. |
sourceModifiedTime | integer <int64> >= 0 The timestamp for when the file was last modified in the source system. |
securityCategories | Array of integers <int64> (CogniteInternalId) [ 0 .. 100 ] items [ items <int64 > [ 1 .. 9007199254740991 ] ] The security category IDs required to access this file. |
labels | Array of objects (LabelList) [ 0 .. 10 ] items unique A list of the labels associated with this resource item. |
geoLocation | object (GeoLocation) Geographic metadata. |
{
  "externalId": "my.known.id",
  "name": "string",
  "directory": "string",
  "source": "string",
  "mimeType": "image/jpeg",
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "assetIds": [1],
  "dataSetId": 1,
  "sourceCreatedTime": 0,
  "sourceModifiedTime": 0,
  "securityCategories": [1],
  "labels": [
    {"externalId": "my.known.id"}
  ],
  "geoLocation": {
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [0, 0]
    },
    "properties": {}
  }
}
{
  "externalId": "my.known.id",
  "name": "string",
  "directory": "string",
  "source": "string",
  "mimeType": "image/jpeg",
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "assetIds": [1],
  "dataSetId": 1,
  "sourceCreatedTime": 0,
  "sourceModifiedTime": 0,
  "securityCategories": [1],
  "labels": [
    {"externalId": "my.known.id"}
  ],
  "geoLocation": {
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [0, 0]
    },
    "properties": {}
  },
  "id": 1,
  "uploaded": true,
  "uploadedTime": 0,
  "createdTime": 0,
  "lastUpdatedTime": 0,
  "uploadUrl": "string"
}
A sequence stores a table with up to 400 columns indexed by row number. There can be at most 400 numeric columns and 200 string columns. Each of the columns has a pre-defined type: a string, integer, or floating point number.
For example, a sequence can represent a curve, either with the dependent variable x as the row number and a single value column y, or can simply store (x,y) pairs in the rows directly. Other potential applications include data logs in which the index isn't time-based. To learn more about sequences, see the concept guide.
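As a concrete illustration of storing (x, y) pairs directly in the rows, the helper below builds an insert-rows payload of the shape used by the sequence data endpoints. The sequence external id and column external ids are made-up examples, not real identifiers:

```javascript
// Builds an insert-rows request body for a sequence that stores (x, y) pairs,
// one pair per row, with the row number as a simple 1-based index.
function toSequenceRows(points) {
  return {
    items: [
      {
        externalId: "curve.example", // hypothetical sequence external id
        columns: ["x", "y"],         // hypothetical column external ids
        rows: points.map((p, i) => ({ rowNumber: i + 1, values: [p.x, p.y] })),
      },
    ],
  };
}
```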
The aggregation API allows you to compute aggregated results from a set of sequences, such as
getting the number of sequences in a project or checking what assets the different sequences
in your project are associated with (along with the number of sequences for each asset).
By specifying filter and/or advancedFilter, the aggregation will take place only over those sequences that match the filters. filter and advancedFilter behave the same way as in the list endpoint.
The default behavior, when the aggregate field is not specified in the request body, is to return the number of sequences that match the filters (if any), which is the same behavior as when the aggregate field is set to count.
The following requests will both return the total number of sequences whose name begins with pump:
{
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
and
{
"aggregate": "count",
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The response might be:
{"items": [{"count": 42}]}
Setting aggregate to uniqueValues and specifying a property in properties (this field is an array, but currently only supports one property) will return all unique values (up to a maximum of 1000) that are taken on by that property across all the sequences that match the filters, as well as the number of sequences that have each of those property values.
This example request finds all the unique asset ids that are referenced by the sequences in your project whose name begins with pump:
{
"aggregate": "uniqueValues",
"properties": [{"property": ["assetId"]}],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The response might be the following, saying that 23 sequences are associated with asset 18 and 107 sequences are associated with asset 76:
{
"items": [
{"values": ["18"], "count": 23},
{"values": ["76"], "count": 107}
]
}
Setting aggregate to cardinalityValues will instead return the approximate number of distinct values that are taken on by the given property among the matching sequences.
Example request:
{
"aggregate": "cardinalityValues",
"properties": [{"property": ["assetId"]}],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result is likely exact when the set of unique values is small. In this example, there are likely two distinct asset ids among the matching sequences:
{"items": [{"count": 2}]}
Setting aggregate to uniqueProperties will return the set of unique properties whose property path begins with path (which can currently only be ["metadata"]) that are contained in the sequences that match the filters.
Example request:
{
"aggregate": "uniqueProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result contains all the unique metadata keys in the sequences whose name begins with pump, and the number of sequences that contain each metadata key:
{
"items": [
{"values": [{"property": ["metadata", "tag"]}], "count": 43},
{"values": [{"property": ["metadata", "installationDate"]}], "count": 97}
]
}
Setting aggregate to cardinalityProperties will instead return the approximate number of different property keys whose path begins with path (which can currently only be ["metadata"], meaning that this can only be used to count the approximate number of distinct metadata keys among the matching sequences).
Example request:
{
"aggregate": "cardinalityProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}}
}
The result is likely exact when the set of unique values is small. In this example, there are likely two distinct metadata keys among the matching sequences:
{"items": [{"count": 2}]}
The aggregateFilter field may be specified if aggregate is set to cardinalityProperties or uniqueProperties. The structure of this field is similar to that of advancedFilter, except that the set of leaf filters is smaller (in, prefix, and range), and that none of the leaf filters specify a property. Unlike advancedFilter, which is applied before the aggregation (in order to restrict the set of sequences that the aggregation operation should be applied to), aggregateFilter is applied after the initial aggregation has been performed, in order to restrict the set of results.
The following examples illustrate how aggregateFilter works.
When aggregate is set to uniqueProperties, the result set contains a number of property paths, each with an associated count that shows how many sequences contained that property (among those sequences that matched the filter and advancedFilter, if they were specified). If aggregateFilter is specified, it will restrict the property paths included in the output. Let us add an aggregateFilter to the uniqueProperties example from above:
{
"aggregate": "uniqueProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}},
"aggregateFilter": {"prefix": {"value": "t"}}
}
Now, the result only contains those metadata properties whose key begins with t (but it will be the same set of metadata properties beginning with t as in the original query without aggregateFilter, and the counts will be the same):
{
"items": [
{"values": [{"property": ["metadata", "tag"]}], "count": 43}
]
}
Similarly, adding aggregateFilter to cardinalityProperties will return the approximate number of properties whose property key matches aggregateFilter from those sequences matching the filter and advancedFilter (or from all sequences if neither filter nor advancedFilter is specified):
{
"aggregate": "cardinalityProperties",
"path": ["metadata"],
"advancedFilter": {"prefix": {"property": ["name"], "value": "pump"}},
"aggregateFilter": {"prefix": {"value": "t"}}
}
As we saw above, only one property matches:
{"items": [{"count": 1}]}
Note that aggregateFilter is also accepted when aggregate is set to cardinalityValues or cardinalityProperties. For those aggregations, the effect of any aggregateFilter could also be achieved via a similar advancedFilter. However, aggregateFilter is not accepted when aggregate is omitted or set to count.
Rate and concurrency limits apply to this endpoint. If a request exceeds one of the limits, it will be throttled with a 429: Too Many Requests response. More on limit types and how to avoid being throttled is described here.
Limit | Per project | Per user (identity) |
---|---|---|
Rate | 15 requests per second | 10 requests per second |
Concurrency | 6 concurrent requests | 4 concurrent requests |
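The truncated exponential backoff strategy recommended for handling 429 responses can be sketched as a delay schedule. Real clients typically add random jitter to avoid synchronized retries; it is omitted here so the function stays deterministic, and the base and cap values are illustrative choices, not values mandated by the API:

```javascript
// Truncated exponential backoff: the retry delay doubles with each attempt
// (attempt 0, 1, 2, ...) and is capped ("truncated") at maxDelayMs.
function backoffDelayMs(attempt, baseDelayMs = 500, maxDelayMs = 32000) {
  return Math.min(baseDelayMs * 2 ** attempt, maxDelayMs);
}

// A retry loop would sleep backoffDelayMs(attempt) after each 429 response,
// giving up after a fixed maximum number of attempts.
```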
Aggregates the sequences that match the given criteria.
advancedFilter | (Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (TimeSeriesFilterLanguage) A filter DSL (Domain Specific Language) to define advanced filter queries. |
aggregateFilter | (Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (in (object) or range (object) or prefix (object))) (TimeSeriesAggregateFilter) A filter DSL (Domain Specific Language) to define aggregate filters. |
filter | object (SequenceFilter) |
aggregate | string Value: "count" The |
{
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            {
              "equals": {
                "property": ["metadata", "manufacturer"],
                "value": "acme"
              }
            },
            {
              "in": {
                "property": ["name"],
                "values": ["pump-1-temperature", "motor-9-temperature"]
              }
            },
            {
              "range": {
                "property": ["dataSetId"],
                "gte": 1,
                "lt": 10
              }
            }
          ]
        }
      },
      {
        "and": [
          {
            "equals": {
              "property": ["assetId"],
              "value": 1234
            }
          },
          {
            "equals": {
              "property": ["description"],
              "value": "Temperature in Celsius"
            }
          }
        ]
      }
    ]
  },
  "aggregateFilter": {
    "and": [{}]
  },
  "filter": {
    "name": "string",
    "externalIdPrefix": "my.known.prefix",
    "metadata": {
      "key1": "value1",
      "key2": "value2"
    },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "rootAssetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetSubtreeIds": [
      {"id": 1234567890},
      {"externalId": "externalId123"}
    ],
    "createdTime": {"max": 0, "min": 0},
    "lastUpdatedTime": {"max": 0, "min": 0},
    "dataSetIds": [
      {"id": 1}
    ]
  },
  "aggregate": "count"
}
{
  "items": [
    {"count": 0}
  ]
}
Create one or more sequences.
Sequence to be stored.
items required | Array of objects (PostSequenceDTO) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"}
        }
      ],
      "dataSetId": 1
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ]
}
Deletes the given rows of the sequence. All columns are affected.
Indicate the sequences and the rows where data should be deleted.
items required | Array of Select by Id (object) or Select by ExternalId (object) (SequenceDeleteDataRequest) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "rows": [1],
      "id": 1
    }
  ]
}
{ }
Deletes the sequences with the specified IDs. If one or more of the sequences do not exist, the ignoreUnknownIds parameter controls what will happen: if it is true, the sequences that do exist will be deleted, and the request succeeds; if it is false or absent, nothing will be deleted, and the request fails.
Ids of the sequences to delete.
items required | Array of Select by Id (object) or Select by ExternalId (object) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {"id": 1}
  ],
  "ignoreUnknownIds": false
}
{ }
The advancedFilter field lets you create complex filtering expressions that combine simple operations, such as equals, prefix, and exists, by using the Boolean operators and, or, and not.
Filtering applies to basic fields as well as metadata. See the advancedFilter syntax in the request example.
Leaf filter | Supported fields | Description and example |
---|---|---|
containsAll |
Array type fields | Only includes results which contain all of the specified values. {"containsAll": {"property": ["property"], "values": [1, 2, 3]}} |
containsAny |
Array type fields | Only includes results which contain at least one of the specified values. {"containsAny": {"property": ["property"], "values": [1, 2, 3]}} |
equals |
Non-array type fields | Only includes results that are equal to the specified value. {"equals": {"property": ["property"], "value": "example"}} |
exists |
All fields | Only includes results where the specified property exists (has a value). {"exists": {"property": ["property"]}} |
in |
Non-array type fields | Only includes results that are equal to one of the specified values. {"in": {"property": ["property"], "values": [1, 2, 3]}} |
prefix |
String type fields | Only includes results which start with the specified text. {"prefix": {"property": ["property"], "value": "example"}} |
range |
Non-array type fields | Only includes results that fall within the specified range. {"range": {"property": ["property"], "gt": 1, "lte": 5}} Supported operators: gt , lt , gte , lte |
search |
["name"] and ["description"] |
Introduced to provide functional parity with the /sequences/search endpoint. {"search": {"property": ["property"], "value": "example"}} |
Property | Type |
---|---|
["description"] |
string |
["externalId"] |
string |
["metadata", "<someCustomKey>"] |
string |
["name"] |
string |
["assetId"] |
number |
["assetRootId"] |
number |
["createdTime"] |
number |
["dataSetId"] |
number |
["id"] |
number |
["lastUpdatedTime"] |
number |
["accessCategories"] |
array of strings |
- The and and or clauses must have at least one element (and at most 99, since each element counts towards the total clause limit, and so does the and/or clause itself).
- The property array of each leaf filter has the following limitations:
  - The property array must match one of the existing properties (static top-level property or dynamic metadata property).
- The containsAll, containsAny, and in filter values array size must be in the range [1, 100].
- The containsAll, containsAny, and in filter values array must contain elements of number or string type (matching the type of the given property).
- The range filter must have at least one of the gt, gte, lt, lte attributes. However, gt is mutually exclusive with gte, and lt is mutually exclusive with lte.
- The values of gt, gte, lt, lte in the range filter must be of number or string type (matching the type of the given property).
- The search filter value must not be blank, its length must be in the range [1, 128], and there may be at most two search filters in the entire filter query.
- The maximum length of the value of a leaf filter that is applied to a string property is 256.

By default, sequences are sorted by their creation time in ascending order.
Sorting by another property, or by several other properties, can be explicitly requested via the sort field, which must contain a list of one or more sort specifications. Each sort specification indicates the property to sort on and, optionally, the order in which to sort (defaults to asc). If multiple sort specifications are supplied, the results are sorted on the first property, and those with the same value for the first property are sorted on the second property, and so on.
Partitioning is done independently of sorting; there is no guarantee of sort order between elements from different partitions.
If the nulls field has the auto value, or the field isn't specified, null (missing) values are considered bigger than any other values. They are placed last when sorting in the asc order and first in the desc order. Otherwise, missing values are placed according to the nulls field (last or first), and their placement won't depend on the order field. Note that the number zero, empty strings, and empty lists are all considered not null.
{
"sort": [
{
"property" : ["createdTime"],
"order": "desc",
"nulls": "last"
},
{
"property" : ["metadata", "<someCustomKey>"]
}
]
}
You can sort on the following properties:
Property |
---|
["assetId"] |
["createdTime"] |
["dataSetId"] |
["description"] |
["externalId"] |
["lastUpdatedTime"] |
["metadata", "<someCustomKey>"] |
["name"] |
The sort array must contain 1 to 2 elements.
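A client-side comparator mirroring these sort semantics (order defaulting to asc, and the auto placement of nulls) might look like the sketch below. The spec shape with a get accessor is an illustrative stand-in for the API's property path; it is not part of the API itself:

```javascript
// Builds a comparator from 1..2 sort specs, each with a property accessor,
// an order ("asc" by default), and a nulls placement ("auto", "first", "last").
// With "auto", null values sort as larger than any other value, so they land
// last for asc and first for desc, matching the description above.
function compareBySpecs(specs) {
  return (a, b) => {
    for (const { get, order = "asc", nulls = "auto" } of specs) {
      const va = get(a), vb = get(b);
      const na = va == null, nb = vb == null;
      if (na || nb) {
        if (na && nb) continue;          // both missing: tie, try next spec
        if (nulls === "first") return na ? -1 : 1;
        if (nulls === "last") return na ? 1 : -1;
        const big = na ? 1 : -1;         // "auto": null is the biggest value
        return order === "asc" ? big : -big;
      }
      if (va !== vb) {
        const cmp = va < vb ? -1 : 1;
        return order === "asc" ? cmp : -cmp;
      }
    }
    return 0;
  };
}
```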
Retrieves a list of sequences matching the given criteria.
filter | object (SequenceFilter) |
advancedFilter | (Boolean filter (and (object) or or (object) or not (object))) or (Leaf filter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or search (object))) (TimeSeriesFilterLanguage) A filter DSL (Domain Specific Language) to define advanced filter queries. |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Returns up to this many results per page. |
cursor | string |
partition | string (Partition) Splits the data set into N partitions. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
sort | Array of objects (TimeSeriesSortItem) [ 1 .. 2 ] items Sort by array of selected properties. |
{
  "filter": {
    "name": "string",
    "externalIdPrefix": "my.known.prefix",
    "metadata": {
      "key1": "value1",
      "key2": "value2"
    },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "rootAssetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetSubtreeIds": [
      {"id": 1234567890},
      {"externalId": "externalId123"}
    ],
    "createdTime": {"max": 0, "min": 0},
    "lastUpdatedTime": {"max": 0, "min": 0},
    "dataSetIds": [
      {"id": 1}
    ]
  },
  "advancedFilter": {
    "or": [
      {
        "not": {
          "and": [
            {
              "equals": {
                "property": ["metadata", "manufacturer"],
                "value": "acme"
              }
            },
            {
              "in": {
                "property": ["name"],
                "values": ["pump-1-temperature", "motor-9-temperature"]
              }
            },
            {
              "range": {
                "property": ["dataSetId"],
                "gte": 1,
                "lt": 10
              }
            }
          ]
        }
      },
      {
        "and": [
          {
            "equals": {
              "property": ["assetId"],
              "value": 1234
            }
          },
          {
            "equals": {
              "property": ["description"],
              "value": "Temperature in Celsius"
            }
          }
        ]
      }
    ]
  },
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "partition": "1/10",
  "sort": [
    {
      "property": ["string"],
      "order": "asc",
      "nulls": "first"
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ],
  "nextCursor": "string"
}
Inserts rows into a sequence. This overwrites data in rows and columns that exist.
Data posted.
items required | Array of Select by Id (object) or Select by ExternalId (object) (SequencePostData) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "DL/DRILL412/20190103/T3",
      "columns": ["Depth", "DepthSource", "PowerSetting"],
      "rows": [
        {
          "rowNumber": 1,
          "values": [23331.3, "s2", 61]
        }
      ]
    }
  ]
}
{ }
List sequences. Use nextCursor to paginate through the results.
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
partition | string Example: partition=1/10 Splits the data set into N partitions. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may reduce the number of partitions silently. For example, CDF may reduce the number of partitions to 10. In future releases of the resource APIs, Cognite may reject requests if you specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
limit | integer [ 1 .. 1000 ] Default: 25 Limits the number of results to be returned. The server returns a maximum of 1000 results even if you specify a higher limit. |
const sequences = await client.sequences.list({ filter: { name: 'sequence_name' } });
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ],
  "nextCursor": "string"
}
Retrieves the last row in one or more sequences. Note that the last row in a sequence is the one with the highest row number, which is not necessarily the one that was ingested most recently.
Description of data requested.
columns | Array of strings [ 1 .. 400 ] items Columns to include. Specified as a list of the |
before | integer <int64> >= 1 Get rows up to, but not including, this row number. |
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
{
  "columns": ["string"],
  "before": 1,
  "id": 1
}
{
  "id": 1112,
  "externalId": "DL/DRILL412/20190103/T3",
  "columns": [
    {"externalId": "Depth"},
    {"externalId": "DepthSource"},
    {"externalId": "PowerSetting"}
  ],
  "rows": [
    {
      "rowNumber": 1,
      "values": [23331.3, "s2", 61]
    }
  ]
}
Processes data requests and returns the result. Note that this operation uses a dynamic limit on the number of rows returned based on the number and type of columns; use the provided cursor to paginate and retrieve all data.
Description of data requested.
start | integer <int64> Default: 0 Lowest row number included. |
end | integer <int64> Get rows up to, but excluding, this row number. Default - No limit. |
limit | integer <int32> [ 1 .. 10000 ] Default: 100 Maximum number of rows returned in one request. The API might return fewer rows even if more data is available; in that case, it provides a cursor for fetching the remaining data. |
cursor | string Cursor for pagination returned from a previous request. Apart from this cursor, the rest of the request object is the same as for the original request. |
columns | Array of strings [ 1 .. 400 ] items Columns to include. Specified as a list of the |
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
{
  "start": 0,
  "end": 1,
  "limit": 1,
  "cursor": "string",
  "columns": ["string"],
  "id": 1
}
{
  "id": 1112,
  "externalId": "DL/DRILL412/20190103/T3",
  "columns": [
    {"externalId": "Depth"},
    {"externalId": "DepthSource"},
    {"externalId": "PowerSetting"}
  ],
  "rows": [
    {
      "rowNumber": 1,
      "values": [23331.3, "s2", 61]
    }
  ],
  "nextCursor": "string"
}
Retrieves one or more sequences by ID or external ID. The response returns the sequences in the same order as in the request.
Ids of the sequences
items required | Array of Select by Id (object) or Select by ExternalId (object) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {"id": 1}
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ]
}
Retrieves a list of sequences matching the given criteria. This operation doesn't support pagination.
filter | object (SequenceFilter) |
search | object (SequenceSearch) |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Returns up to this many results. |
{
  "filter": {
    "name": "string",
    "externalIdPrefix": "my.known.prefix",
    "metadata": {
      "key1": "value1",
      "key2": "value2"
    },
    "assetIds": [363848954441724, 793045462540095, 1261042166839739],
    "rootAssetIds": [363848954441724, 793045462540095, 1261042166839739],
    "assetSubtreeIds": [
      {"id": 1234567890},
      {"externalId": "externalId123"}
    ],
    "createdTime": {"max": 0, "min": 0},
    "lastUpdatedTime": {"max": 0, "min": 0},
    "dataSetIds": [
      {"id": 1}
    ]
  },
  "search": {
    "name": "string",
    "description": "string",
    "query": "string"
  },
  "limit": 100
}
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ]
}
Updates one or more sequences. Fields outside of the request remain unchanged.
Patch definition
items required | Array of Select by Id (object) or Select by ExternalId (object) (SequencesUpdate) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "update": {
        "name": {"set": "string"},
        "description": {"set": "string"},
        "assetId": {"set": 0},
        "externalId": {"set": "string"},
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "dataSetId": {"set": 0},
        "columns": {
          "modify": [
            {
              "externalId": "my.known.id",
              "update": {
                "description": {"set": "string"},
                "externalId": {"set": "string"},
                "name": {"set": "string"},
                "metadata": {
                  "set": {
                    "key1": "value1",
                    "key2": "value2"
                  }
                }
              }
            }
          ],
          "add": [
            {
              "name": "depth",
              "externalId": "DPS1",
              "description": "Optional description",
              "valueType": "STRING",
              "metadata": {"extracted-by": "cognite"}
            }
          ],
          "remove": [
            {"externalId": "my.known.id"}
          ]
        }
      },
      "id": 1
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "name": "Any relevant name",
      "description": "Optional description",
      "assetId": 1221123111,
      "externalId": "my.known.id",
      "metadata": {"extracted-by": "cognite"},
      "columns": [
        {
          "name": "depth",
          "externalId": "DPS1",
          "description": "Optional description",
          "valueType": "STRING",
          "metadata": {"extracted-by": "cognite"},
          "createdTime": 100000000000,
          "lastUpdatedTime": 100000000000
        }
      ],
      "createdTime": 100000000000,
      "lastUpdatedTime": 100000000000,
      "dataSetId": 2718281828459
    }
  ]
}
The Geospatial API lets you model a problem domain when data has a geometric or geographic nature. The geospatial data is organized in feature types: homogeneous collections of features (geospatial items), each having the same spatial representation, such as points, lines, or polygons, and a common set of typed properties. The Geospatial API is aware of Coordinate Reference Systems, and allows transformations. To learn more about geospatial concepts, see the concept guide.
Search for features based on the feature property filter and perform the requested aggregations on a given property. Aggregations are supported for all filters that do not contain stWithin, stWithinProperly, stContains, or stContainsProperly searches in 3D geometries.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
allowDimensionalityMismatch | boolean (GeospatialAllowDimensionalityMismatch) Optional parameter indicating if the spatial filter operators allow input geometries with a different dimensionality than the properties they are applied to. For instance, when set to true, if a feature type has a property of type POLYGONZM (4D), its features can be filtered using the |
filter | GeospatialFeatureNotFilter (object) or GeospatialFeatureAndFilter (object) or GeospatialFeatureOrFilter (object) or GeospatialFeatureEqualsFilter (object) or GeospatialFeatureMissingFilter (object) or GeospatialFeatureLikeFilter (object) or GeospatialFeatureRegexFilter (object) or GeospatialFeatureRangeFilter (object) or GeospatialFeatureContainsAnyFilter (object) or GeospatialFeatureInFilter (object) or GeospatialFeatureStIntersectsFilter (object) or GeospatialFeatureStIntersects3dFilter (object) or GeospatialFeatureStWithinFilter (object) or GeospatialFeatureStWithinProperlyFilter (object) or GeospatialFeatureStContainsFilter (object) or GeospatialFeatureStContainsProperlyFilter (object) or GeospatialFeatureStWithinDistanceFilter (object) or GeospatialFeatureStWithinDistance3dFilter (object) (GeospatialFeatureFilter) |
aggregates | Array of strings Deprecated Items Enum: "avg" "count" "max" "min" "stCentroid" "stCollect" "stConvexHull" "stIntersection" "stUnion" "sum" "variance" This parameter is deprecated. Use |
property | string Deprecated This parameter is deprecated. Use |
outputSrid | integer (GeospatialReferenceId) [ 0 .. 1000000 ] EPSG code, e.g. 4326. Only valid for geometry types. See https://en.wikipedia.org/wiki/Spatial_reference_system |
groupBy | Array of strings Names of the properties to group by. |
sort | Array of strings Sort result by the selected fields (properties or aggregates). Default sort order is ascending if not specified. Available sort direction: ASC, DESC, ASC_NULLS_FIRST, DESC_NULLS_FIRST, ASC_NULLS_LAST, DESC_NULLS_LAST. |
object (GeospatialAggregateOutput) A list of aggregations which are requested. |
{
  "filter": {
    "and": [
      { "range": { "property": "temperature", "gt": 4.54 } },
      {
        "stWithin": {
          "property": "location",
          "value": {
            "wkt": "POLYGON((60.547602 -5.423433, 60.547602 -6.474416, 60.585858 -6.474416, 60.585858 -5.423433, 60.547602 -5.423433))"
          }
        }
      }
    ]
  },
  "groupBy": [ "category" ],
  "sort": [ "average:ASC", "category:DESC" ],
  "output": {
    "min_temperature": { "min": { "property": "temperature" } },
    "max_speed": { "max": { "property": "speed" } }
  }
}
{
  "items": [
    { "category": "first category", "max": 12.3, "min": 0.5, "average": 5.32 },
    { "category": "second category", "max": 14.3, "min": 0.7, "average": 8.32 }
  ]
}
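As a client-side sketch, the aggregate request body above can be assembled with a small helper. The `buildAggregateBody` function is hypothetical, not part of any Cognite SDK; it only shows how the optional `groupBy` and `sort` fields are omitted when empty.

```javascript
// Hypothetical helper: assemble an aggregate request body like the example above.
// Omits optional fields (groupBy, sort) when they are empty.
function buildAggregateBody({ filter, groupBy = [], sort = [], output }) {
  const body = { filter, output };
  if (groupBy.length > 0) body.groupBy = groupBy;
  if (sort.length > 0) body.sort = sort;
  return body;
}

// Usage: request the minimum temperature per category.
const body = buildAggregateBody({
  filter: { range: { property: "temperature", gt: 4.54 } },
  groupBy: ["category"],
  output: { min_temperature: { min: { property: "temperature" } } },
});
```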
Compute custom json output structures or well known binary format responses based on calculation or selection of feature properties or direct values given in the request.
required | object |
{
  "output": {
    "value": {
      "stTransform": {
        "geometry": { "ewkt": "SRID=4326;POLYGON((0 0,10 0,10 10,0 10,0 0))" },
        "srid": 23031
      }
    }
  }
}
{
  "items": [
    {
      "value": {
        "wkt": "POLYGON((0 0,10.5 0,10.5 10.5,0 10.5,0 0))",
        "srid": 23031
      }
    }
  ]
}
Creates custom Coordinate Reference Systems.
List of custom Coordinate Reference Systems to be created.
required | Array of objects (GeospatialCoordinateReferenceSystem) |
{
  "items": [
    {
      "srid": 456789,
      "wkt": "GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.0174532925199433,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]]",
      "projString": "+proj=longlat +datum=WGS84 +no_defs \"\""
    }
  ]
}
{
  "items": [
    {
      "srid": 4326,
      "wkt": "POINT (0 0)",
      "projString": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Creates feature types. Each tenant can have up to 100 feature types.
List of feature types to be created. It is possible to create up to 100 feature types in one request provided the total number of feature types on the tenant will not exceed 100.
required | Array of objects (GeospatialFeatureTypeSpec) [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "properties": {
        "temperature": { "type": "DOUBLE" },
        "location": { "type": "POINT", "srid": 4326 }
      },
      "searchSpec": {
        "location_idx": { "properties": [ "location" ] }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "dataSetId": 1278028,
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505,
      "properties": {
        "temperature": { "type": "DOUBLE" },
        "location": { "type": "POINT", "srid": 4326 },
        "createdTime": { "type": "LONG" },
        "lastUpdatedTime": { "type": "LONG" },
        "externalId": { "type": "STRING", "size": 32 }
      },
      "searchSpec": {
        "location_idx": { "properties": [ "location" ] },
        "createdTimeIdx": { "properties": [ "createdTime" ] },
        "lastUpdatedTimeIdx": { "properties": [ "lastUpdatedTime" ] },
        "externalIdIdx": { "properties": [ "externalId" ] }
      }
    }
  ]
}
Create features
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
required | Array of objects (GeospatialFeatureSpec) [ 1 .. 1000 ] items [ items <= 200 properties ] |
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" }
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" }
    }
  ]
}
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ]
}
Delete a raster from a feature property. If there is no raster already, the operation is a no-op.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
featureExternalId required | string <= 256 characters ^[A-Za-z][A-Za-z0-9_]{0,255}$ Example: ocean_measure_W87H62 External Id of the feature provided by client. Must be unique among all feature external ids within a CDF project and feature type. |
rasterPropertyName required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: bathymetry Name of the raster property provided by client. Must be unique among all feature property names within a feature type. |
{ }
Delete custom Coordinate Reference Systems.
List of custom Coordinate Reference Systems to be deleted.
required | Array of objects [ 1 .. 1000 ] items |
{
  "items": [
    { "srid": 4326 }
  ]
}
{ }
Delete feature types.
List of feature types to be deleted. It is possible to delete a maximum of 10 feature types per request. Feature types must not have related features. Feature types with related features can be deleted using force flag.
recursive | boolean (GeospatialRecursiveDelete) Indicates if feature types should be deleted together with all related features. Optional parameter, defaults to false. |
required | Array of objects [ 1 .. 10 ] items |
{
  "items": [
    { "externalId": "ocean_temperature" }
  ]
}
{ }
Delete features.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
List of features to be deleted. It is possible to post a maximum of 1000 items per request.
required | Array of objects (GeospatialItemExternalId) [ 1 .. 1000 ] items |
{
  "items": [
    { "externalId": "measurement_point_765" },
    { "externalId": "measurement_point_863" }
  ]
}
{ }
List features based on the feature property filter passed in the body of the request. This operation supports pagination by cursor.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
allowDimensionalityMismatch | boolean (GeospatialAllowDimensionalityMismatch) Optional parameter indicating if the spatial filter operators allow input geometries with a different dimensionality than the properties they are applied to. For instance, when set to true, if a feature type has a property of type POLYGONZM (4D), its features can be filtered using the |
GeospatialFeatureNotFilter (object) or GeospatialFeatureAndFilter (object) or GeospatialFeatureOrFilter (object) or GeospatialFeatureEqualsFilter (object) or GeospatialFeatureMissingFilter (object) or GeospatialFeatureLikeFilter (object) or GeospatialFeatureRegexFilter (object) or GeospatialFeatureRangeFilter (object) or GeospatialFeatureContainsAnyFilter (object) or GeospatialFeatureInFilter (object) or GeospatialFeatureStIntersectsFilter (object) or GeospatialFeatureStIntersects3dFilter (object) or GeospatialFeatureStWithinFilter (object) or GeospatialFeatureStWithinProperlyFilter (object) or GeospatialFeatureStContainsFilter (object) or GeospatialFeatureStContainsProperlyFilter (object) or GeospatialFeatureStWithinDistanceFilter (object) or GeospatialFeatureStWithinDistance3dFilter (object) (GeospatialFeatureFilter) | |
limit | integer (SearchLimit) [ 1 .. 1000 ] Default: 1000 Limits the number of results to be returned. |
object (GeospatialOutput) Desired output specification. | |
cursor | string |
{
  "filter": {
    "and": [
      { "range": { "property": "temperature", "gt": 4.54 } },
      {
        "stWithin": {
          "property": "location",
          "value": {
            "wkt": "POLYGON((60.547602 -5.423433, 60.547602 -6.474416, 60.585858 -6.474416, 60.585858 -5.423433, 60.547602 -5.423433))"
          }
        }
      }
    ]
  },
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo"
}
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ],
  "nextCursor": "pwGTFXeL-JiWO8CZpgP23g"
}
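The `nextCursor` field in the response above drives pagination: pass its value back as `cursor` (all other parameters unchanged) until no cursor is returned. A minimal sketch, where `post` stands in for any HTTP client and the endpoint path is illustrative:

```javascript
// Sketch: drain a cursor-paginated list endpoint into a single array.
// `post(path, body)` is an injected HTTP helper returning the parsed JSON page.
async function listAllFeatures(post, body) {
  const items = [];
  let cursor = undefined;
  do {
    const page = await post("/features/list", { ...body, cursor });
    items.push(...page.items);
    cursor = page.nextCursor; // undefined on the last page
  } while (cursor !== undefined);
  return items;
}
```

Injecting the request function keeps the loop testable and independent of any particular HTTP library.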
Get a raster from a feature property. The feature property must be of type RASTER.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
featureExternalId required | string <= 256 characters ^[A-Za-z][A-Za-z0-9_]{0,255}$ Example: ocean_measure_W87H62 External Id of the feature provided by client. Must be unique among all feature external ids within a CDF project and feature type. |
rasterPropertyName required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: bathymetry Name of the raster property provided by client. Must be unique among all feature property names within a feature type. |
srid | integer (GeospatialReferenceId) [ 0 .. 1000000 ] EPSG code, e.g. 4326. Only valid for geometry types. See https://en.wikipedia.org/wiki/Spatial_reference_system |
scaleX | number <double> |
scaleY | number <double> |
format required | string Value: "XYZ" |
object |
{
  "srid": 4326,
  "scaleX": 0,
  "scaleY": 0,
  "format": "GTiff",
  "options": { "JPEG_QUALITY": 1 }
}
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [ { } ],
    "duplicated": [ { } ],
    "invalid": [ { } ],
    "dependencies": [ { } ]
  }
}
Get Coordinate Reference Systems by their Spatial Reference IDs
required | Array of objects [ 1 .. 1000 ] items |
{
  "items": [
    { "srid": 4326 }
  ]
}
{
  "items": [
    {
      "srid": 4326,
      "wkt": "GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.0174532925199433,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]]",
      "projString": "+proj=longlat +datum=WGS84 +no_defs \"\"",
      "createdTime": 1633596134000,
      "lastUpdatedTime": 1633596134000
    }
  ]
}
Get features with paging support
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer (SearchLimit) [ 1 .. 1000 ] Default: 1000 Limits the number of results to be returned. |
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Example: allowCrsTransformation=true Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
allowDimensionalityMismatch | boolean (GeospatialAllowDimensionalityMismatch) Example: allowDimensionalityMismatch=true Optional parameter indicating if the spatial filter operators allow input geometries with a different dimensionality than the properties they are applied to. For instance, when set to true, if a feature type has a property of type POLYGONZM (4D), its features can be filtered using the |
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ],
  "nextCursor": "pwGTFXeL-JiWO8CZpgP23g"
}
List the defined Coordinate Reference Systems. The list can be limited to the custom Coordinate Reference Systems defined for the tenant.
filterOnlyCustom | boolean Example: filterOnlyCustom=true Optional parameter to only list custom Coordinate Reference Systems. Defaults to false. |
{
  "items": [
    {
      "srid": 4326,
      "wkt": "GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.0174532925199433,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]]",
      "projString": "+proj=longlat +datum=WGS84 +no_defs \"\"",
      "createdTime": 1633596134000,
      "lastUpdatedTime": 1633596134000
    }
  ]
}
List all feature types
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "dataSetId": 1278028,
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505,
      "properties": {
        "temperature": { "type": "DOUBLE" },
        "location": { "type": "POINT", "srid": 4326 },
        "createdTime": { "type": "LONG" },
        "lastUpdatedTime": { "type": "LONG" },
        "externalId": { "type": "STRING", "size": 32 }
      },
      "searchSpec": {
        "location_idx": { "properties": [ "location" ] },
        "createdTimeIdx": { "properties": [ "createdTime" ] },
        "lastUpdatedTimeIdx": { "properties": [ "lastUpdatedTime" ] },
        "externalIdIdx": { "properties": [ "externalId" ] }
      }
    }
  ]
}
Put a raster into a feature property. The feature property must be of type RASTER.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
featureExternalId required | string <= 256 characters ^[A-Za-z][A-Za-z0-9_]{0,255}$ Example: ocean_measure_W87H62 External Id of the feature provided by client. Must be unique among all feature external ids within a CDF project and feature type. |
rasterPropertyName required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: bathymetry Name of the raster property provided by client. Must be unique among all feature property names within a feature type. |
srid required | integer [ 1 .. 999999 ] Example: srid=3857 mandatory parameter that specifies the SRID of the coordinate reference system of a raster. |
format required | string Value: "XYZ" Example: format=XYZ mandatory parameter that specifies the format of the input raster. |
scaleX | number <double> Example: scaleX=2 optional parameter that specifies the pixel scale x in storage. If not specified, the pixel scale remains the same as the input raster. |
scaleY | number <double> Example: scaleY=2 optional parameter that specifies the pixel scale y in storage. If not specified, the pixel scale remains the same as the input raster. |
allowCrsTransformation | boolean Example: allowCrsTransformation=true Optional parameter indicating if the input raster coordinates should be transformed into the Coordinate Reference Systems defined for the raster property in the feature type specification. The transformation will typically impact the pixel values. When the parameter is false, requests with rasters in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. |
{
  "srid": 3857,
  "width": 4,
  "height": 5,
  "numBands": 1,
  "scaleX": 1,
  "scaleY": 1,
  "skewX": 0,
  "skewY": 0,
  "upperLeftX": -0.5,
  "upperLeftY": -0.5
}
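The raster metadata returned above (`upperLeftX/Y`, `scaleX/Y`, `skewX/Y`) describes the standard affine geotransform used by tools such as GDAL. As a sketch of what it means, a pixel index (col, row) maps to world coordinates like this; whether the origin refers to pixel corners or centers is an assumption here, not stated in this reference:

```javascript
// Sketch: affine geotransform from pixel indices to world coordinates.
// meta comes from the raster metadata response above.
function pixelToWorld(meta, col, row) {
  return {
    x: meta.upperLeftX + col * meta.scaleX + row * meta.skewX,
    y: meta.upperLeftY + col * meta.skewY + row * meta.scaleY,
  };
}

// Usage with the example metadata: pixel (2, 3) with unit scale and no skew.
const meta = { upperLeftX: -0.5, upperLeftY: -0.5, scaleX: 1, scaleY: 1, skewX: 0, skewY: 0 };
const p = pixelToWorld(meta, 2, 3);
```

Note that many geospatial stacks use a negative `scaleY` so that row indices grow downward while world Y grows upward.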
Retrieves feature types by external ids. It is possible to retrieve up to 100 items per request, i.e. the maximum number of feature types for a tenant.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    { "externalId": "ocean_temperature" }
  ]
}
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "dataSetId": 1278028,
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505,
      "properties": {
        "temperature": { "type": "DOUBLE" },
        "location": { "type": "POINT", "srid": 4326 },
        "createdTime": { "type": "LONG" },
        "lastUpdatedTime": { "type": "LONG" },
        "externalId": { "type": "STRING", "size": 32 }
      },
      "searchSpec": {
        "location_idx": { "properties": [ "location" ] },
        "createdTimeIdx": { "properties": [ "createdTime" ] },
        "lastUpdatedTimeIdx": { "properties": [ "lastUpdatedTime" ] },
        "externalIdIdx": { "properties": [ "externalId" ] }
      }
    }
  ]
}
Retrieves features by external ids. It is possible to retrieve up to 1000 items per request.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
required | Array of objects (GeospatialItemExternalId) [ 1 .. 1000 ] items |
object (GeospatialOutput) Desired output specification. |
{
  "items": [
    { "externalId": "measurement_point_765" },
    { "externalId": "measurement_point_863" }
  ]
}
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ]
}
Search for features based on the feature property filter passed in the body of the request. The streaming response format can be length-prefixed, newline-delimited, record-separator-delimited, or concatenated, depending on the requested output (see https://en.wikipedia.org/wiki/JSON_streaming).
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
allowDimensionalityMismatch | boolean (GeospatialAllowDimensionalityMismatch) Optional parameter indicating if the spatial filter operators allow input geometries with a different dimensionality than the properties they are applied to. For instance, when set to true, if a feature type has a property of type POLYGONZM (4D), its features can be filtered using the |
GeospatialFeatureNotFilter (object) or GeospatialFeatureAndFilter (object) or GeospatialFeatureOrFilter (object) or GeospatialFeatureEqualsFilter (object) or GeospatialFeatureMissingFilter (object) or GeospatialFeatureLikeFilter (object) or GeospatialFeatureRegexFilter (object) or GeospatialFeatureRangeFilter (object) or GeospatialFeatureContainsAnyFilter (object) or GeospatialFeatureInFilter (object) or GeospatialFeatureStIntersectsFilter (object) or GeospatialFeatureStIntersects3dFilter (object) or GeospatialFeatureStWithinFilter (object) or GeospatialFeatureStWithinProperlyFilter (object) or GeospatialFeatureStContainsFilter (object) or GeospatialFeatureStContainsProperlyFilter (object) or GeospatialFeatureStWithinDistanceFilter (object) or GeospatialFeatureStWithinDistance3dFilter (object) (GeospatialFeatureFilter) | |
limit | number |
object (GeospatialOutputStreaming) Desired output specification for streaming. |
{
  "filter": {
    "and": [
      { "range": { "property": "temperature", "gt": 4.54 } },
      {
        "stWithin": {
          "property": "location",
          "value": {
            "wkt": "POLYGON((60.547602 -5.423433, 60.547602 -6.474416, 60.585858 -6.474416, 60.585858 -5.423433, 60.547602 -5.423433))"
          }
        }
      }
    ]
  },
  "limit": 100,
  "output": { "jsonStreamFormat": "NEW_LINE_DELIMITED" }
}
{
  "property1": "string",
  "property2": "string",
  "externalId": "my.known.id",
  "createdTime": 0,
  "lastUpdatedTime": 0
}
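With `jsonStreamFormat` set to NEW_LINE_DELIMITED, each feature arrives as one JSON document per line, and a record may be split across network chunks. A minimal client-side parser sketch (not part of any Cognite SDK) that buffers the trailing partial line:

```javascript
// Sketch: incremental parser for a newline-delimited JSON (NDJSON) stream.
// Feed raw text chunks via push(); onRecord is called once per complete record.
function createNdjsonParser(onRecord) {
  let buffer = "";
  return {
    push(chunk) {
      buffer += chunk;
      const lines = buffer.split("\n");
      buffer = lines.pop(); // keep the incomplete trailing line for later
      for (const line of lines) {
        if (line.trim() !== "") onRecord(JSON.parse(line));
      }
    },
    end() {
      // flush a final record that was not newline-terminated
      if (buffer.trim() !== "") onRecord(JSON.parse(buffer));
      buffer = "";
    },
  };
}
```

The other stream formats (length-prefixed, record-separator-delimited, concatenated) need different framing logic; this sketch covers only the newline-delimited case.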
Search for features based on the feature property filter passed in the body of the request. The result of the search is limited to a maximum of 1000 items. Results in excess of the limit are truncated. This means that the complete result set of the search cannot be retrieved with this method. However, for a given unmodified feature collection, the result of the search is deterministic and does not change over time.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
allowDimensionalityMismatch | boolean (GeospatialAllowDimensionalityMismatch) Optional parameter indicating if the spatial filter operators allow input geometries with a different dimensionality than the properties they are applied to. For instance, when set to true, if a feature type has a property of type POLYGONZM (4D), its features can be filtered using the |
GeospatialFeatureNotFilter (object) or GeospatialFeatureAndFilter (object) or GeospatialFeatureOrFilter (object) or GeospatialFeatureEqualsFilter (object) or GeospatialFeatureMissingFilter (object) or GeospatialFeatureLikeFilter (object) or GeospatialFeatureRegexFilter (object) or GeospatialFeatureRangeFilter (object) or GeospatialFeatureContainsAnyFilter (object) or GeospatialFeatureInFilter (object) or GeospatialFeatureStIntersectsFilter (object) or GeospatialFeatureStIntersects3dFilter (object) or GeospatialFeatureStWithinFilter (object) or GeospatialFeatureStWithinProperlyFilter (object) or GeospatialFeatureStContainsFilter (object) or GeospatialFeatureStContainsProperlyFilter (object) or GeospatialFeatureStWithinDistanceFilter (object) or GeospatialFeatureStWithinDistance3dFilter (object) (GeospatialFeatureFilter) | |
limit | integer (SearchLimit) [ 1 .. 1000 ] Default: 1000 Limits the number of results to be returned. |
object (GeospatialOutput) Desired output specification. | |
sort | Array of strings Sort result by selected fields. Syntax: sort:["field_1","field_2:ASC","field_3:DESC"]. Default sort order is ascending if not specified. Available sort direction: ASC, DESC, ASC_NULLS_FIRST, DESC_NULLS_FIRST, ASC_NULLS_LAST, DESC_NULLS_LAST. |
{
  "filter": {
    "and": [
      { "range": { "property": "temperature", "gt": 4.54 } },
      {
        "stWithin": {
          "property": "location",
          "value": {
            "wkt": "POLYGON((60.547602 -5.423433, 60.547602 -6.474416, 60.585858 -6.474416, 60.585858 -5.423433, 60.547602 -5.423433))"
          }
        }
      }
    ]
  },
  "limit": 100,
  "sort": [ "temperature:ASC", "location" ]
}
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ]
}
Update one or more feature types
List of feature types to be updated. It is possible to add and remove properties and indexes. WARNING: removing properties will result in data loss in corresponding features.
required | Array of objects (GeospatialUpdateFeatureTypeSpec) [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "update": {
        "properties": {
          "add": { "depth": { "type": "DOUBLE" } }
        },
        "searchSpec": {
          "add": { "depth_idx": { "properties": [ "depth" ] } }
        }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "ocean_temperature",
      "dataSetId": 1278028,
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505,
      "properties": {
        "temperature": { "type": "DOUBLE" },
        "location": { "type": "POINT", "srid": 4326 },
        "createdTime": { "type": "LONG" },
        "lastUpdatedTime": { "type": "LONG" },
        "externalId": { "type": "STRING", "size": 32 }
      },
      "searchSpec": {
        "location_idx": { "properties": [ "location" ] },
        "createdTimeIdx": { "properties": [ "createdTime" ] },
        "lastUpdatedTimeIdx": { "properties": [ "lastUpdatedTime" ] },
        "externalIdIdx": { "properties": [ "externalId" ] }
      }
    }
  ]
}
Update features. This is a replace operation, i.e., all feature properties have to be sent in the request body even if their values do not change.
featureTypeExternalId required | string <= 32 characters ^[A-Za-z][A-Za-z0-9_]{0,31}$ Example: ocean_measures External Id of the feature type provided by client. Must be unique among all feature type external ids within a CDF project. |
List of features to update.
allowCrsTransformation | boolean (GeospatialAllowCrsTransformation) Optional parameter indicating if input geometry properties should be transformed into the respective Coordinate Reference Systems defined in the feature type specification. If the parameter is true, then input geometries will be transformed when the input and output Coordinate Reference Systems differ. When it is false, then requests with geometries in Coordinate Reference System different from the ones defined in the feature type will result in bad request response code. Transformations apply to property geometries in case of create and update feature, as well as to geometries in spatial filters in search endpoints. |
required | Array of objects (GeospatialFeatureSpec) [ 1 .. 1000 ] items [ items <= 200 properties ] |
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" }
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" }
    }
  ]
}
{
  "items": [
    {
      "externalId": "measurement_point_765",
      "temperature": 5.65,
      "location": { "wkt": "POINT(60.547602 -5.423433)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    },
    {
      "externalId": "measurement_point_863",
      "temperature": 5.03,
      "location": { "wkt": "POINT(60.585858 -6.474416)" },
      "createdTime": 1629784673505,
      "lastUpdatedTime": 1629784673505
    }
  ]
}
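Because feature update is a full replace, a partial change must be merged into the feature's current properties before it is sent; otherwise any omitted property is lost. A sketch of that merge, assuming `current` comes from the retrieve endpoint (the helper name is hypothetical; dropping the server-managed timestamp fields before resending is an assumption about good practice, not a stated requirement):

```javascript
// Sketch: build a full replacement item for the update endpoint by merging a
// partial change into the feature's current state. Server-managed timestamps
// are stripped since the server maintains them itself (an assumption).
function buildReplaceItem(current, changes) {
  const { createdTime, lastUpdatedTime, ...props } = current;
  return { ...props, ...changes };
}

// Usage: change only the temperature while preserving location and externalId.
const merged = buildReplaceItem(
  {
    externalId: "measurement_point_765",
    temperature: 5.65,
    location: { wkt: "POINT(60.547602 -5.423433)" },
    createdTime: 1629784673505,
    lastUpdatedTime: 1629784673505,
  },
  { temperature: 6.1 }
);
```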
A seismic object is a no-copy view into seismic stores. Once you have defined the object, either via a polygon to "cut out" from the origin seismic store or via an explicit trace-by-trace mapping, you cannot modify it. You can assign seismic objects to partitions and restrict user access to each partition. That way, seismic objects are the most granular unit of access control. Each seismic object has one corresponding partition. If a user is restricted to a specific partition, they will only be able to view the seismic objects that have been assigned to that partition.
Retrieves a SEG-Y file with all traces contained within the given seismic object.
seismicId required | integer The identifier of a seismic object |
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [ { } ],
    "duplicated": [ { } ]
  }
}
Download multiple seismic objects specified by the filter, as a streamed ZIP archive file.
The filter that determines the seismic objects to return.
items required | Array of integers <int64> (CogniteInternalId) [ items <int64 > [ 1 .. 9007199254740991 ] ] The list of seismic objects to include in the ZIP archive, specified by internal id. |
{
  "items": [ 1 ]
}
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [ { } ],
    "duplicated": [ { } ]
  }
}
The models to create.
required | Array of objects (CreateModel3D) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "name": "My Model",
      "dataSetId": 1,
      "metadata": { "property1": "string", "property2": "string" }
    }
  ]
}
{
  "items": [
    {
      "name": "My Model",
      "id": 1000,
      "createdTime": 0,
      "dataSetId": 1,
      "metadata": { "property1": "string", "property2": "string" }
    }
  ]
}
List of models to delete.
required | Array of objects (DataIdentifier) [ 1 .. 1000 ] items unique List of ID objects |
{
  "items": [
    { "id": 1 }
  ]
}
{ }
Retrieves a list of all models in a project. This operation supports pagination. You can filter out all models without a published revision.
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
published | boolean Filter based on whether or not the model has published revisions. |
const models3D = await client.models3D.list({ published: true });
{- "items": [
- {
- "name": "My Model",
- "id": 1000,
- "createdTime": 0,
- "dataSetId": 1,
- "metadata": {
- "property1": "string",
- "property2": "string"
}
}
], - "nextCursor": "string"
}
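The cursor-based pagination shown above can be sketched as a simple loop. This is a minimal sketch, not SDK code: `fetchPage` is a hypothetical stand-in for whatever call performs one request and returns a `{ items, nextCursor }` response like the sample above.

```javascript
// Collect every item from a cursor-paginated endpoint.
// `fetchPage` is a placeholder: it takes { cursor, limit } and must
// return { items, nextCursor }, mirroring the response shape above.
async function listAll(fetchPage, limit = 100) {
  const all = [];
  let cursor;
  do {
    // All parameters except `cursor` must stay the same between calls.
    const page = await fetchPage({ cursor, limit });
    all.push(...page.items);
    cursor = page.nextCursor;
  } while (cursor !== undefined);
  return all;
}
```

When a response omits nextCursor, the last page has been reached and the loop stops.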
modelId required | integer <int64> Model ID. |
await client.models3D.retrieve(3744350296805509);
{- "name": "My Model",
- "id": 1000,
- "createdTime": 0,
- "dataSetId": 1,
- "metadata": {
- "property1": "string",
- "property2": "string"
}
}
List of changes.
required | Array of objects (UpdateModel3D) [ 1 .. 1000 ] items |
{- "items": [
- {
- "id": 1,
- "update": {
- "name": {
- "set": "string"
}, - "dataSetId": {
- "set": 1
}, - "metadata": {
- "set": {
- "key1": "value1",
- "key2": "value2"
}
}
}
}
]
}
{- "items": [
- {
- "name": "My Model",
- "id": 1000,
- "createdTime": 0,
- "dataSetId": 1,
- "metadata": {
- "property1": "string",
- "property2": "string"
}
}
]
}
modelId required | integer <int64> Model ID. |
The revisions to create.
required | Array of objects (CreateRevision3D) [ 1 .. 1000 ] items |
{- "items": [
- {
- "published": false,
- "rotation": [
- 0,
- 0,
- 0
], - "scale": [
- 1,
- 1,
- 1
], - "translation": [
- 0,
- 0,
- 0
], - "metadata": {
- "property1": "string",
- "property2": "string"
}, - "camera": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}, - "fileId": 0
}
]
}
{- "items": [
- {
- "id": 1000,
- "fileId": 1000,
- "published": false,
- "rotation": [
- 0,
- 0,
- 0
], - "scale": [
- 1,
- 1,
- 1
], - "translation": [
- 0,
- 0,
- 0
], - "camera": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}, - "status": "Done",
- "metadata": {
- "property1": "string",
- "property2": "string"
}, - "thumbnailThreedFileId": 1000,
- "assetMappingCount": 0,
- "createdTime": 0
}
]
}
modelId required | integer <int64> Model ID. |
List of revisions ids to delete.
required | Array of objects (DataIdentifier) [ 1 .. 1000 ] items unique List of ID objects |
{- "items": [
- {
- "id": 1
}
]
}
{ }
List nodes in a project, filtered by node names or node property values specified by supplied filters. This operation supports pagination and partitions.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
object (Node3DPropertyFilter) Filters used in the search. | |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
partition | string (Partition) Splits the data set into N partitions, formatted as m/n, where m is the index of the partition (1-based) and n is the total number of partitions. The same partition value must be passed to all subqueries when paging through a partition. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10. In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
{- "filter": {
- "properties": {
- "PDMS": {
- "Area": [
- "AB76",
- "AB77",
- "AB78"
], - "Type": [
- "PIPE",
- "BEND",
- "PIPESUP"
]
}
}
}, - "limit": 100,
- "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
- "partition": "1/10"
}
{- "items": [
- {
- "id": 1000,
- "treeIndex": 3,
- "parentId": 2,
- "depth": 2,
- "name": "Node name",
- "subtreeSize": 4,
- "properties": {
- "category1": {
- "property1": "value1",
- "property2": "value2"
}, - "category2": {
- "property1": "value1",
- "property2": "value2"
}
}, - "boundingBox": {
- "max": [
- 0,
- 0,
- 0
], - "min": [
- 0,
- 0,
- 0
]
}
}
], - "nextCursor": "string"
}
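The m/n partition format used in the request above can be generated and fanned out concurrently. This is a sketch: `listNodes` is a hypothetical stand-in for one partitioned request (which must itself be paginated with its own cursor while keeping the same partition value on every subquery).

```javascript
// Build the partition labels "1/n" .. "n/n" used for parallel retrieval.
function partitionLabels(n) {
  return Array.from({ length: n }, (_, i) => `${i + 1}/${n}`);
}

// Fire one request per partition concurrently. `listNodes` is a
// hypothetical stand-in; the same partition value must be passed to
// every subquery (cursor page) within that partition.
async function listAllPartitions(listNodes, n = 10) {
  const perPartition = await Promise.all(
    partitionLabels(n).map((partition) => listNodes({ partition }))
  );
  return perPartition.flat();
}
```

Keeping n at or below 10 matches the guidance above; requests beyond the concurrency quota return 429 and should be retried with backoff.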
Retrieves specific nodes given by a list of IDs.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
The request body containing the IDs of the nodes to retrieve.
required | Array of objects (Node3DId) [ 1 .. 1000 ] items |
{- "items": [
- {
- "id": 1000
}
]
}
{- "items": [
- {
- "id": 1000,
- "treeIndex": 3,
- "parentId": 2,
- "depth": 2,
- "name": "Node name",
- "subtreeSize": 4,
- "properties": {
- "category1": {
- "property1": "value1",
- "property2": "value2"
}, - "category2": {
- "property1": "value1",
- "property2": "value2"
}
}, - "boundingBox": {
- "max": [
- 0,
- 0,
- 0
], - "min": [
- 0,
- 0,
- 0
]
}
}
]
}
Retrieves a list of ancestor nodes of a given node, including itself, in the hierarchy of the 3D model. This operation supports pagination.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
nodeId required | integer <int64> ID of the node to get the ancestors of. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
const nodes3d = await client.revisions3D.list3DNodeAncestors(8252999965991682, 4190022127342195, 572413075141081);
{- "items": [
- {
- "id": 1000,
- "treeIndex": 3,
- "parentId": 2,
- "depth": 2,
- "name": "Node name",
- "subtreeSize": 4,
- "properties": {
- "category1": {
- "property1": "value1",
- "property2": "value2"
}, - "category2": {
- "property1": "value1",
- "property2": "value2"
}
}, - "boundingBox": {
- "max": [
- 0,
- 0,
- 0
], - "min": [
- 0,
- 0,
- 0
]
}
}
], - "nextCursor": "string"
}
Retrieves a list of nodes from the hierarchy in the 3D model. You can also request a specific subtree with the 'nodeId' query parameter and limit the depth of the resulting subtree with the 'depth' query parameter. By default, nodes are returned in order of ascending treeIndex. We suggest setting the query parameter sortByNodeId to true to check whether it makes your use case faster. The partition parameter can only be used if sortByNodeId is set to true. This operation supports pagination.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
partition | string Example: partition=1/10 Splits the data set into N partitions, formatted as m/n, where m is the index of the partition (1-based) and n is the total number of partitions. The same partition value must be passed to all subqueries when paging through a partition. To prevent unexpected problems and maximize read throughput, you should use at most 10 (N <= 10) partitions. When using more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10. In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, the requests will result in a 400 Bad Request status. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
depth | integer <int32> Get sub nodes up to this many levels below the specified node. Depth 0 is the root node. |
nodeId | integer <int64> ID of the node that is the root of the subtree you request (default is the root node). |
sortByNodeId | boolean Default: false Enable sorting by nodeId. When this parameter is true, nodes are returned in order of ascending nodeId instead of treeIndex. |
properties | string <jsonObject(jsonObject(string))> Filter for node properties. Only nodes that match all the given properties exactly will be listed. The filter must be a JSON object with the same format as the properties field of a node. |
const nodes3d = await client.revisions3D.list3DNodes(8252999965991682, 4190022127342195);
{- "items": [
- {
- "id": 1000,
- "treeIndex": 3,
- "parentId": 2,
- "depth": 2,
- "name": "Node name",
- "subtreeSize": 4,
- "properties": {
- "category1": {
- "property1": "value1",
- "property2": "value2"
}, - "category2": {
- "property1": "value1",
- "property2": "value2"
}
}, - "boundingBox": {
- "max": [
- 0,
- 0,
- 0
], - "min": [
- 0,
- 0,
- 0
]
}
}
], - "nextCursor": "string"
}
List log entries for the revision
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
severity | integer <int64> Default: 5 Minimum severity to retrieve (3 = INFO, 5 = WARN, 7 = ERROR). |
{- "items": [
- {
- "timestamp": 0,
- "severity": 7,
- "type": "CONVERTER/FAILED",
- "info": "string"
}
]
}
Retrieves a list of all revisions of a model. This operation supports pagination. You can also filter revisions by published status using the query parameter published.
modelId required | integer <int64> Model ID. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
published | boolean Filter based on published status. |
const revisions3D = await client.revisions3D.list(324566546546346);
{- "items": [
- {
- "id": 1000,
- "fileId": 1000,
- "published": false,
- "rotation": [
- 0,
- 0,
- 0
], - "scale": [
- 1,
- 1,
- 1
], - "translation": [
- 0,
- 0,
- 0
], - "camera": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}, - "status": "Done",
- "metadata": {
- "property1": "string",
- "property2": "string"
}, - "thumbnailThreedFileId": 1000,
- "assetMappingCount": 0,
- "createdTime": 0
}
], - "nextCursor": "string"
}
Retrieve a list of available outputs for a processed 3D model. An output can be a format consumed by a viewer (e.g. Reveal) or imported into external tools. Each output has an associated version that identifies the version of the output format (not the revision of the processed output). Note that the structure of the outputs varies and is not covered here.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
format | string Format identifier, e.g. 'ept-pointcloud' (point cloud). Well known formats are: 'ept-pointcloud' (point cloud data) or 'reveal-directory' (output supported by Reveal). 'all-outputs' can be used to retrieve all outputs for a 3D revision. Note that some of the outputs are internal, where the format and availability might change without warning. |
{- "items": [
- {
- "format": "ept-pointcloud",
- "version": 1,
- "blobId": 1
}
]
}
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
const revisions3D = await client.revisions3D.retrieve(8252999965991682, 4190022127342195)
{- "id": 1000,
- "fileId": 1000,
- "published": false,
- "rotation": [
- 0,
- 0,
- 0
], - "scale": [
- 1,
- 1,
- 1
], - "translation": [
- 0,
- 0,
- 0
], - "camera": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}, - "status": "Done",
- "metadata": {
- "property1": "string",
- "property2": "string"
}, - "thumbnailThreedFileId": 1000,
- "assetMappingCount": 0,
- "createdTime": 0
}
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
The request body containing the file ID of the thumbnail image (from Files API).
fileId required | integer <int64> File ID of thumbnail file in Files API. Only JPEG and PNG files are supported. |
{- "fileId": 0
}
{ }
modelId required | integer <int64> Model ID. |
List of changes.
required | Array of objects (UpdateRevision3D) [ 1 .. 1000 ] items |
{- "items": [
- {
- "id": 1,
- "update": {
- "published": {
- "set": true
}, - "rotation": {
- "set": [
- 0,
- 0,
- 0
]
}, - "scale": {
- "set": [
- 0,
- 0,
- 0
]
}, - "translation": {
- "set": [
- 0,
- 0,
- 0
]
}, - "camera": {
- "set": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}
}, - "metadata": {
- "set": {
- "key1": "value1",
- "key2": "value2"
}
}
}
}
]
}
{- "items": [
- {
- "id": 1000,
- "fileId": 1000,
- "published": false,
- "rotation": [
- 0,
- 0,
- 0
], - "scale": [
- 1,
- 1,
- 1
], - "translation": [
- 0,
- 0,
- 0
], - "camera": {
- "target": [
- 0,
- 0,
- 0
], - "position": [
- 0,
- 0,
- 0
]
}, - "status": "Done",
- "metadata": {
- "property1": "string",
- "property2": "string"
}, - "thumbnailThreedFileId": 1000,
- "assetMappingCount": 0,
- "createdTime": 0
}
]
}
Retrieve the contents of a 3D file.
This endpoint supports tag-based caching.
This endpoint is only compatible with 3D file IDs from the 3D API, not with file IDs from the Files API.
threedFileId required | integer <int64> The ID of the 3D file to retrieve. |
await client.files3D.retrieve(3744350296805509);
{- "error": {
- "code": 401,
- "message": "Could not authenticate.",
- "missing": [
- { }
], - "duplicated": [
- { }
]
}
}
Create asset mappings
Asset references (asset IDs) provided when creating a mapping are allowed to be invalid. CDF does not validate or maintain them; mappings are stored until the reference is removed through the delete endpoint for 3D asset mappings.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
The asset mappings to create.
required | Array of objects (CreateAssetMapping3D) [ 1 .. 1000 ] items |
{- "items": [
- {
- "nodeId": 1003,
- "assetId": 3001
}
]
}
{- "items": [
- {
- "nodeId": 1003,
- "assetId": 3001
}
]
}
Delete a list of asset mappings
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
The IDs of the asset mappings to delete.
required | Array of objects (DeleteAssetMapping3D) [ 1 .. 1000 ] items |
{- "items": [
- {
- "nodeId": 1003,
- "assetId": 3001
}
]
}
{ }
Lists 3D asset mappings that match the specified filter parameter. Only one type of filter can be specified for each request: either assetIds, nodeIds, or treeIndexes.
Asset references obtained from a mapping (through asset IDs) may be invalid due to the non-transactional nature of HTTP. CDF does not maintain them; they are stored until the reference is removed through the delete endpoint for 3D asset mappings.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
The filter for asset mappings to get.
AssetMapping3DAssetFilter (object) or AssetMapping3DNodeFilter (object) or AssetMapping3DTreeIndexFilter (object) | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
{- "filter": {
- "assetIds": [
- 0
]
}, - "limit": 100,
- "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo"
}
{- "items": [
- {
- "nodeId": 1003,
- "assetId": 3001
}
], - "nextCursor": "string"
}
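Because a filter request may specify exactly one of assetIds, nodeIds, or treeIndexes, it can be worth validating the filter client-side before sending it. A hypothetical guard, not part of the API:

```javascript
// The filter may contain exactly one of these keys.
const FILTER_KEYS = ["assetIds", "nodeIds", "treeIndexes"];

// Throws if zero or more than one filter type is present.
function validateMappingFilter(filter) {
  const present = FILTER_KEYS.filter((k) => filter[k] !== undefined);
  if (present.length !== 1) {
    throw new Error(
      `filter must specify exactly one of ${FILTER_KEYS.join(", ")}`
    );
  }
  return filter;
}
```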
List all asset mappings
Asset references obtained from a mapping (through asset IDs) may be invalid due to the non-transactional nature of HTTP. CDF does not maintain them; they are stored until the reference is removed through the delete endpoint for 3D asset mappings.
modelId required | integer <int64> Model ID. |
revisionId required | integer <int64> Revision ID. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
nodeId | integer <int64> |
assetId | integer <int64> |
intersectsBoundingBox | string If given, only return asset mappings for assets whose bounding box intersects the given bounding box. Must be a JSON object with min and max keys, each an array of 3 coordinates. |
const mappings3D = await client.assetMappings3D.list(3244265346345, 32423454353545);
{- "items": [
- {
- "nodeId": 1003,
- "assetId": 3001
}
], - "nextCursor": "string"
}
Retrieves a list of node IDs from the hierarchy of all available 3D models that are mapped to the supplied asset ID. If a node ID is mapped to the asset ID but is invalid or no longer exists, it is omitted from the results.
Asset references obtained from a mapping (through asset IDs) may be invalid due to the non-transactional nature of HTTP. CDF does not maintain them; they are stored until the reference is removed through the delete endpoint for 3D asset mappings.
assetId required | integer <int64> Asset ID. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
{- "items": [
- {
- "modelId": 3001,
- "revisionId": 3001,
- "nodeId": 1003
}
]
}
The entity matching contextualization endpoints lets you match CDF resources. For example, you can match time series to assets. The model uses similarity between string-fields from the source and the target to find potential matches, for instance the source name and the target name. The exact algorithm may change over time.
Train a model that predicts matches between entities (for example, time series names to asset names). This is also known as fuzzy joining. If there are no trueMatches (labeled data), a static (unsupervised) model is trained; otherwise a machine-learned (supervised) model is trained.
sources required | Array of objects (Sources) [ 0 .. 2000000 ] items List of custom source object to match from, for example, time series. String key -> value. Only string values are considered in the matching. Both |
targets required | Array of objects (Targets) [ 1 .. 2000000 ] items List of custom target object to match to, for example, assets. String key -> value. Only string values are considered in the matching. Both |
Array of objects or objects or objects or objects (TrueMatches) [ 1 .. 2000000 ] items [ items ] List of objects of pairs of sourceId or sourceExternalId and targetId or targetExternalId, that corresponds to entities in source and target respectively, that indicates a confirmed match used to train the model. If omitted, an unsupervised model is used. | |
externalId | string (CogniteExternalId) <= 255 characters The external ID provided by the client. Must be unique for the resource type. |
name | string (ModelName) <= 256 characters User defined name. |
description | string (ModelDescription) <= 500 characters User defined description. |
featureType | string Default: "simple" Enum: "simple" "insensitive" "bigram" "frequencyweightedbigram" "bigramextratokenizers" "bigramcombo" Each feature type defines one combination of features that will be created and used in the entity matcher model. All features are based on matching tokens. Tokens are defined at the top of the Entity matching section. The options are:
|
Array of objects (MatchFields) Default: [{"source":"name","target":"name"}] List of pairs of fields from the target and source items used to calculate features. All source and target items should have all of the source and target fields specified here. |
classifier | string (Classifier) Default: "randomforest" Enum: "randomforest" "decisiontree" "logisticregression" "augmentedlogisticregression" "augmentedrandomforest" The classifier used in the model. Only relevant if there are trueMatches/labeled data and a supervised model is fitted. |
ignoreMissingFields | boolean (IgnoreMissingFields) Default: false If True, replaces missing fields in |
{- "sources": [
- {
- "id": 10,
- "name": "a_name",
- "field": "value",
- "ignoredfield": {
- "key": "value"
}
}
], - "targets": [
- {
- "id": 6,
- "name": "some_name",
- "somefield": "value",
- "ignoredfield": {
- "key": "value"
}
}
], - "trueMatches": [
- {
- "sourceId": 1,
- "targetId": 1
}, - {
- "sourceExternalId": "2",
- "targetExternalId": "2"
}
], - "externalId": "my.known.id",
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "classifier": "randomforest",
- "ignoreMissingFields": true
}
{- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
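Since only string values are considered in the matching, it can be useful to check that every source and target item actually carries the fields referenced by matchFields before calling fit. A hypothetical client-side check, not part of the API:

```javascript
// matchFields pairs a field on source items with a field on target items,
// e.g. [{ source: "name", target: "name" }]. Returns a list of problems;
// an empty array means the request is consistent.
function checkMatchFields(sources, targets, matchFields) {
  const missing = [];
  for (const { source, target } of matchFields) {
    if (!sources.every((s) => typeof s[source] === "string"))
      missing.push(`source field "${source}"`);
    if (!targets.every((t) => typeof t[target] === "string"))
      missing.push(`target field "${target}"`);
  }
  return missing;
}
```

Alternatively, set ignoreMissingFields to true and let the service substitute empty strings for missing fields.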
Deletes an entity matching model. Currently, this is a soft delete, and only removes the entry from listing.
required | Array of objects or objects (OneOfId) List of ids or externalIds of models. |
{- "items": [
- {
- "id": 2563587950655335
}, - {
- "externalId": "myUniqueName"
}
]
}
{ }
Use filtering options to find entity matcher models.
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
required | object Filter on models with strict matching. |
{- "limit": 100,
- "filter": {
- "featureType": "simple",
- "classifier": "randomforest",
- "originalId": 111,
- "name": "simple_model_1",
- "description": "Simple model 1"
}
}
{- "items": [
- {
- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
]
}
List all available entity matching models.
limit | integer >= 1 Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
{- "items": [
- {
- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
]
}
Predicts entity matches using a trained model. Note: the 'assetsAcl:READ' capability is required unless both sources and targets are specified in the request. Also note that the header of a successful response contains an X-Job-Token, which allows fetching the result of the job at /context/entitymatching/jobs/{jobId} without requiring 'assetsAcl:READ'.
id required | integer <int64> [ 1 .. 9007199254740991 ] The ID of the model that is used to predict matches. |
sources | Array of objects [ 0 .. 2000000 ] items List of source entities to predict matches for, for example, time series. If omitted, will use the sources from the model fit. |
targets | Array of objects [ 1 .. 2000000 ] items List of potential target entities to match to one or more of the source entities, for example, assets. If omitted, will use the targets from the model fit. |
numMatches | integer [ 0 .. 100 ] The maximum number of results to return for each source entity. |
scoreThreshold | number [ 0 .. 1 ] Only return matches with score above this threshold. |
{- "externalId": "my.known.id",
- "sources": [
- {
- "id": 10,
- "name": "a_name",
- "field": "value",
- "ignoredfield": {
- "key": "value"
}
}
], - "targets": [
- {
- "id": 6,
- "name": "some_name",
- "somefield": "value",
- "ignoredfield": {
- "key": "value"
}
}
], - "numMatches": 3,
- "scoreThreshold": 0.7
}
{- "jobId": 123,
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0
}
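The jobId returned above is typically polled until the job finishes. A minimal polling sketch, assuming a hypothetical getJob helper that performs the GET to /context/entitymatching/jobs/{jobId} (optionally sending the X-Job-Token header), and assuming "Completed" and "Failed" are the terminal statuses:

```javascript
// Poll a contextualization job until it reaches a terminal status.
// `getJob` is a placeholder for the actual GET request; the terminal
// status names are assumptions based on the "status" field shown above.
async function waitForJob(getJob, jobId, { delayMs = 1000, maxTries = 60 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const job = await getJob(jobId);
    if (job.status === "Completed" || job.status === "Failed") return job;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`job ${jobId} did not finish within ${maxTries} polls`);
}
```

A fixed delay is the simplest choice; for production use, the truncated exponential backoff recommended for 429 responses applies here as well.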
Creates a new model by re-training an existing model on its existing data plus additional true matches. The old model is not changed. The new model gets a new ID, and a new external ID if newExternalId is set (or no external ID if it is not set). Use this for efficient re-training of the model after a user confirms additional matches.
id required | integer <int64> [ 1 .. 9007199254740991 ] The ID of the original model. |
newExternalId | string <= 255 characters ExternalId for the new refitted model provided by client. Must be unique within the project. |
required | Array of objects or objects or objects or objects [ 1 .. 2000000 ] items [ items ] List of additional confirmed matches used to train the model. The new model uses a combination of these and the trueMatches from the original model. If there are identical match-from ids, the pair from the original model is dropped. |
sources | Array of objects [ 0 .. 2000000 ] items List of source entities, for example, time series. If omitted, will use data from fit. |
targets | Array of objects [ 1 .. 2000000 ] items List of target entities, for example, assets. If omitted, will use data from fit. |
{- "externalId": "my.known.id",
- "newExternalId": "my.known.id",
- "trueMatches": [
- {
- "sourceId": 1,
- "targetId": 1
}, - {
- "sourceExternalId": "2",
- "targetExternalId": "2"
}
], - "sources": [
- {
- "id": 10,
- "name": "a_name",
- "field": "value",
- "ignoredfield": {
- "key": "value"
}
}
], - "targets": [
- {
- "id": 6,
- "name": "some_name",
- "somefield": "value",
- "ignoredfield": {
- "key": "value"
}
}
]
}
{- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
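The documented merge rule for refitting (if there are identical match-from ids, the pair from the original model is dropped) can be expressed as a small helper. This is a sketch of the documented behavior, not code from the API:

```javascript
// Merge original and additional true matches; a new pair replaces an
// original pair that shares the same match-from (source) identifier.
function mergeTrueMatches(original, additional) {
  const key = (m) =>
    m.sourceId !== undefined ? `id:${m.sourceId}` : `ext:${m.sourceExternalId}`;
  const merged = new Map(original.map((m) => [key(m), m]));
  for (const m of additional) merged.set(key(m), m); // new pair wins
  return [...merged.values()];
}
```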
Shows the status of the model. If the status is completed, shows the parameters used to train the model.
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
{- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
Get the results from a predict job. Note: the 'assetsAcl:READ' capability is required unless you specify a valid X-Job-Token in the request header. The X-Job-Token is provided in the response header of the initial call to /context/entitymatching/predict.
jobId required | integer <int64> (JobId) Example: 123 Contextualization job ID. |
X-Job-Token | string (JobToken) A string token that can be attached to the header of a request fetching the job status. Authenticates the user fetching the job status as the same one who originally posted the job. |
{- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "jobId": 123,
- "items": [
- {
- "source": {
- "field": "value",
- "ignoredfield": {
- "key": "value"
}
}, - "matches": [
- {
- "score": 0.98,
- "target": {
- "field": "value",
- "ignoredfield": {
- "key": "value"
}
}
}
]
}
]
}
Retrieve entity matching models by IDs or external IDs.
required | Array of objects or objects (OneOfId) List of ids or externalIds of models. |
{- "items": [
- {
- "id": 2563587950655335
}, - {
- "externalId": "myUniqueName"
}
]
}
{- "items": [
- {
- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
]
}
Update entity matching models by IDs or external IDs.
required | Array of ModelChangeById (object) or ModelChangeByExternalId (object) (ModelChange) |
{- "items": [
- {
- "update": {
- "name": {
- "set": "simple_model_1"
}, - "description": {
- "set": "Simple model 1"
}
}, - "id": 1
}
]
}
{- "items": [
- {
- "id": 1,
- "externalId": "my.known.id",
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "name": "simple_model_1",
- "description": "Simple model 1",
- "featureType": "simple",
- "matchFields": [
- {
- "source": "name",
- "target": "name"
}, - {
- "source": "field",
- "target": "somefield"
}
], - "ignoreMissingFields": true,
- "classifier": "randomforest",
- "originalId": 111
}
]
}
Convert interactive engineering diagrams to an image format with highlighted annotations. Supported input file MIME types are application/pdf, image/jpeg, image/png, and image/tiff. Supported output image formats are PNG and SVG; only the SVG embeds the input annotations.
required | Array of objects or objects (DiagramConvertRequestSchema) [ 1 .. 50 ] items An array of files and annotations to create interactive diagrams. |
grayscale | boolean (Grayscale) Default: true Return the SVG version in grayscale colors only (reduces the file size). |
{- "items": [
- {
- "fileId": 1234,
- "annotations": [
- {
- "text": "21-PT-1019",
- "confidence": 0.5,
- "region": {
- "shape": "rectangle",
- "vertices": [
- {
- "x": 0.58,
- "y": 0.12
}, - {
- "x": 0.58,
- "y": 0.12
}, - {
- "x": 0.58,
- "y": 0.12
}, - {
- "x": 0.58,
- "y": 0.12
}
], - "page": 1
}, - "entities": [
- {
- "userDefinedField": "21PT1017",
- "ignoredField": "AA11"
}, - {
- "userDefinedField": [
- "21PT1017-A",
- "21PT1017-B"
]
}
]
}
]
}
], - "grayscale": true
}
{- "items": [
- {
- "fileId": 1234
}
], - "jobId": 123,
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "grayscale": true
}
Detect annotations in engineering diagrams. Note: All users in a CDF project with assets read-all and files read-all capabilities can access data sent to this endpoint. Supported input file mime_types are application/pdf, image/jpeg, image/png, image/tiff.
required | Array of objects or objects (FileReferenceWithPageRange) [ 1 .. 50 ] items Files to run entity detection on. |
entities required | Array of objects (DiagramDetectEntities) [ 1 .. 500000 ] items [ items <= 256 properties ] A list of entities to look for. For example, all the assets under a root node. The |
searchField | string (DiagramSearchField) Default: "name" This field determines the string to search for and to identify object entities. |
partialMatch | boolean (DiagramPartialMatch) Default: false Allow partial (fuzzy) matching of entities in the engineering diagrams. Creates a match only when it is possible to do so unambiguously. |
minTokens | integer (DiagramMinTokens) Default: 2 Each detected item must match the detected entity on at least this number of tokens. A token is a substring of consecutive letters or digits. |
{- "items": [
- {
- "pageRange": {
- "begin": 51,
- "end": 100
}, - "fileId": 1234
}
], - "entities": [
- {
- "userDefinedField": "21PT1017",
- "ignoredField": "AA11"
}, - {
- "userDefinedField": [
- "21PT1017-A",
- "21PT1017-B"
]
}
], - "searchField": "userDefinedField",
- "partialMatch": false,
- "minTokens": 2
}
{- "items": [
- {
- "pageRange": {
- "begin": 51,
- "end": 100
}, - "fileId": 1234
}
], - "jobId": 123,
- "status": "Queued",
- "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "searchField": "userDefinedField",
- "partialMatch": false,
- "minTokens": 2
}
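The example request above covers pages 51 to 100, suggesting that large documents are submitted in page ranges. A helper that splits a page count into such ranges; the 50-page chunk size is an assumption taken from the example, not a documented limit:

```javascript
// Split a document of `totalPages` pages into { begin, end } ranges.
// The default chunk of 50 mirrors the 51..100 range in the example above
// (an assumption, not a documented constraint).
function pageRanges(totalPages, chunk = 50) {
  const ranges = [];
  for (let begin = 1; begin <= totalPages; begin += chunk) {
    ranges.push({ begin, end: Math.min(begin + chunk - 1, totalPages) });
  }
  return ranges;
}
```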
Get the results for converting an engineering diagram to SVG and PNG formats.
jobId required | integer <int64> (JobId) Example: 123 Contextualization job ID. |
{- "jobId": 123,
- "status": "Queued",
- "items": [
- {
- "fileId": 1234,
- "fileExternalId": "1234",
- "results": [
- {
- "page": 1,
}
]
}
], - "createdTime": 0,
- "startTime": 0,
- "statusTime": 0,
- "grayscale": true
}
Get the results from an engineering diagram detect job.
jobId required | integer <int64> (JobId) Example: 123 Contextualization job ID. |
{
  "jobId": 123,
  "status": "Queued",
  "items": [
    {
      "fileId": 1234,
      "fileExternalId": "1234",
      "annotations": [
        {
          "text": "21-PT-1019",
          "confidence": 0.5,
          "region": {
            "shape": "rectangle",
            "vertices": [
              {
                "x": 0.58,
                "y": 0.12
              },
              {
                "x": 0.58,
                "y": 0.12
              },
              {
                "x": 0.58,
                "y": 0.12
              },
              {
                "x": 0.58,
                "y": 0.12
              }
            ],
            "page": 1
          },
          "entities": [
            {
              "userDefinedField": "21PT1017",
              "ignoredField": "AA11"
            },
            {
              "userDefinedField": [
                "21PT1017-A",
                "21PT1017-B"
              ]
            }
          ]
        }
      ]
    }
  ],
  "createdTime": 0,
  "startTime": 0,
  "statusTime": 0,
  "searchField": "userDefinedField",
  "partialMatch": false,
  "minTokens": 2
}
The Vision contextualization endpoints enable extraction of information from imagery data based on its visual content. For example, you can detect the external ID or name of assets, detect and read the values of gauges, or identify common industrial objects in images.
This service supports batch processing, which enables processing of multiple image files via an asynchronous prediction request. A new contextualization job is triggered by sending a POST request to the service. The response of the POST request contains a job ID, which can then be used to make subsequent calls to check the status and retrieve the results of the job once it is completed.
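This start-then-poll lifecycle is the same for all asynchronous contextualization jobs. A minimal polling sketch follows; `pollJob` and `getStatus` are our own names, not part of any Cognite SDK, and `getStatus` stands in for a GET call to the job's status endpoint returning an object with a `status` field as in the job responses in this reference:

```javascript
// Generic poller for asynchronous contextualization jobs: call getStatus
// repeatedly until the job reaches a terminal state.
async function pollJob(getStatus, { intervalMs = 1000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const job = await getStatus();
    if (job.status === 'Completed' || job.status === 'Failed') {
      return job;
    }
    // Wait before the next status check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Job did not reach a terminal state in time');
}
```

In practice you would add backoff on HTTP 429 responses, per the throttling guidance above.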
Start an asynchronous prediction job for extracting features such as text, asset tags or industrial objects from images. The response of the POST request contains a job ID, which can be used to make subsequent (GET) calls to check the status and retrieve the results of the job (see Retrieve results from a feature extraction job).
It is possible to have up to 20 concurrent jobs per CDF project.
The files referenced by items in the request body must fulfill the following requirements:
- the file extension must be .jpeg, .jpg, or .png
- the mimeType must be image/png or image/jpeg
New feature extractors may be added in the future.
cdf-version | string Example: alpha cdf version header. Use this to specify the requested CDF release. |
A request for running an extract job.
required | Array of objects or objects (FileReference) [ 1 .. 100 ] List of image files to be analyzed by the feature extractors. |
required | Array of TextDetection (string) or AssetTagDetection (string) or PeopleDetection (string) or IndustrialObjectDetection (string) or PersonalProtectiveEquipmentDetection (string) (VisionExtractFeature) unique The type of detections to perform. New feature extractors may appear. |
object (FeatureParameters) Feature-specific parameters. New feature extractor parameters may appear. |
{
  "items": [
    {
      "fileId": 1234
    }
  ],
  "features": [
    "TextDetection",
    "AssetTagDetection",
    "PeopleDetection"
  ],
  "parameters": {
    "textDetectionParameters": {
      "threshold": 0.8
    },
    "assetTagDetectionParameters": {
      "threshold": 0.8,
      "partialMatch": true,
      "assetSubtreeIds": [
        1,
        2
      ]
    },
    "peopleDetectionParameters": {
      "threshold": 0.8
    }
  }
}
{
  "status": "Queued",
  "createdTime": 0,
  "startTime": 0,
  "statusTime": 0,
  "jobId": 123,
  "items": [
    {
      "fileId": 1234,
      "fileExternalId": "1234"
    }
  ],
  "features": [
    "TextDetection",
    "AssetTagDetection",
    "PeopleDetection"
  ],
  "parameters": {
    "textDetectionParameters": {
      "threshold": 0.8
    },
    "assetTagDetectionParameters": {
      "threshold": 0.8,
      "partialMatch": true,
      "assetSubtreeIds": [
        1,
        2
      ]
    },
    "peopleDetectionParameters": {
      "threshold": 0.8
    }
  }
}
Retrieve results from a feature extraction job on images.
Note that since files are split up into batches and processed independently of each other, the items in successfully completed batches will be returned even if files in other batches are still being processed. The job status will be Running until all batches have been processed. If one of the items in a batch fails, the results from items in other completed batches will still be returned. The corresponding items and error message(s) of failed batches will be populated in failedItems.
Additionally, the status of the job is set to Completed if at least one batch is successfully completed; otherwise, the status is set to Failed.
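The batch-level rule above can be summarized in one line. This is an illustrative sketch of the documented semantics, not SDK code; the `{ ok: boolean }` batch shape is ours:

```javascript
// The job is "Completed" if at least one batch succeeded; it is
// "Failed" only when every batch failed.
function overallJobStatus(batches) {
  return batches.some((batch) => batch.ok) ? 'Completed' : 'Failed';
}
```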
jobId required | integer <int64> (JobId) Example: 123 Contextualization job ID. |
cdf-version | string Example: alpha cdf version header. Use this to specify the requested CDF release. |
// get an existing job, wait for it to complete, and get the results
const { items } = await client.vision.getExtractJob(1, true);
items.forEach((item) => {
  const predictions = item.predictions;
  // do something with the predictions
});
{
  "status": "Queued",
  "createdTime": 0,
  "startTime": 0,
  "statusTime": 0,
  "jobId": 123,
  "items": [
    {
      "fileId": 1234,
      "fileExternalId": "1234",
      "predictions": {
        "textPredictions": [
          {
            "confidence": 0.9,
            "text": "string",
            "textRegion": {
              "xMin": 0.5,
              "xMax": 0.9,
              "yMin": 0.5,
              "yMax": 0.9
            }
          }
        ],
        "assetTagPredictions": [
          {
            "confidence": 0.9,
            "assetRef": {
              "id": 1233
            },
            "text": "string",
            "textRegion": {
              "xMin": 0.5,
              "xMax": 0.9,
              "yMin": 0.5,
              "yMax": 0.9
            }
          }
        ],
        "peoplePredictions": [
          {
            "label": "person",
            "confidence": 0.8,
            "boundingBox": {
              "xMin": 0.5,
              "xMax": 0.9,
              "yMin": 0.5,
              "yMax": 0.9
            }
          }
        ]
      }
    }
  ],
  "failedItems": [
    {
      "errorMessage": "string",
      "items": [
        {
          "fileId": 1234,
          "fileExternalId": "1234"
        }
      ]
    }
  ],
  "parameters": {
    "textDetectionParameters": {
      "threshold": 0.8
    },
    "assetTagDetectionParameters": {
      "threshold": 0.8,
      "partialMatch": true,
      "assetSubtreeIds": [
        1,
        2
      ]
    },
    "peopleDetectionParameters": {
      "threshold": 0.8
    }
  }
}
A document is a file that has been indexed by the document search engine. Every time a file is uploaded, updated or deleted in the Files API, it will also be scheduled for processing by the document search engine. After some processing, it will be possible to search for the file in the document search API.
The document search engine is able to extract content from a variety of document types, and perform classification, contextualization and other operations on the file. This extracted and derived information is made available in the form of a Document object.
The document structure consists of a selection of derived fields, such as the title, author and language of the document, plus some of the original fields from the raw file. The fields from the raw file can be found in the sourceFile structure.
We create a document for each uploaded file, but only derive data from certain files.
The following file types are eligible for further data extraction & enrichment:
The aggregation API lets you compute aggregated results on documents, such as getting the count of all documents in a project, checking different authors of documents in a project and the count of documents in each of those aggregations. By specifying an additional filter or search, you can aggregate only among documents matching the specified filter or search.
When you don't specify the aggregate field in the request body, the default behavior is to return the count of all matched documents.
Aggregate | Description |
---|---|
count | Count of documents matching the specified filters and search. |
cardinalityValues | Returns an approximate count of distinct values for the specified properties. |
cardinalityProperties | Returns an approximate count of distinct properties for a given property path. Currently only implemented for the ["sourceFile", "metadata"] path. |
uniqueValues | Returns the top unique values for the specified properties (up to the requested limit) and the count of each in the property specified in properties. The list will have the highest count first. |
uniqueProperties | Returns the top unique property values for the specified properties (up to the requested limit) and the count of each in the property specified in properties. The list will have the highest count first. |
Only some aggregate types currently support aggregateFilter.
Aggregate filtering works directly on the aggregated result. While a normal filter filters relevant documents, aggregate filtering filters the aggregated bucket values. This is useful for e.g. listing metadata keys; while a normal filter will return all metadata keys for related documents, the aggregate filter can be used to reduce the aggregate result even further.
Tip: use both filter and aggregateFilter to potentially speed up queries, as the aggregateFilter is essentially a post filter.
Here we only show metadata keys that start with "car".
{
"aggregate": "uniqueProperties",
"properties": [{"property": ["sourceFile", "metadata"]}],
"aggregateFilter": {"prefix": {"value": "car"}}
}
Here we only show metadata values that start with "ctx", for the given metadata key "car-codes".
{
"aggregate": "uniqueValues",
"properties": [{"property": ["sourceFile", "metadata", "car-codes"]}],
"aggregateFilter": {"prefix": {"value": "ctx"}}
}
object | |
(bool filters (and (object) or or (object) or not (object))) or (leaf filters (equals (object) or in (object) or containsAny (object) or containsAll (object) or range (object) or prefix (object) or search (object) or exists (object) or geojsonIntersects (object) or geojsonDisjoint (object) or geojsonWithin (object) or inAssetSubtree (object))) (DocumentFilter) A JSON based filtering language. See detailed documentation above. | |
aggregate | string Default: "count" Value: "count" Count of documents matching the specified filters and search. |
{
  "filter": {
    "equals": {
      "property": [
        "type"
      ],
      "value": "PDF"
    }
  },
  "aggregate": "uniqueValues",
  "properties": [
    {
      "property": [
        "author"
      ]
    }
  ]
}
{
  "items": [
    {
      "count": 10
    }
  ]
}
Retrieves a list of the documents in a project. You can use filters to narrow down the list. Unlike the search endpoint, the pagination isn't restricted to 1000 documents in total, meaning this endpoint can be used to iterate through all the documents in your project.
For more information on how the filtering works, see the documentation for the search endpoint.
Fields to be set for the list request.
(bool filters (and (object) or or (object) or not (object))) or (leaf filters (equals (object) or in (object) or containsAny (object) or containsAll (object) or range (object) or prefix (object) or search (object) or exists (object) or geojsonIntersects (object) or geojsonDisjoint (object) or geojsonWithin (object) or inAssetSubtree (object))) (DocumentFilter) A JSON based filtering language. See detailed documentation above. | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Maximum number of items per page. Use the cursor to get more pages. |
cursor | string Cursor for paging through results. |
{
  "filter": {
    "and": [
      {
        "prefix": {
          "property": [
            "name"
          ],
          "value": "Report"
        }
      },
      {
        "equals": {
          "property": [
            "type"
          ],
          "value": "PDF"
        }
      }
    ]
  },
  "limit": 100,
  "cursor": "string"
}
{
  "items": [
    {
      "id": 2384,
      "externalId": "haml001",
      "title": "Hamlet",
      "author": "William Shakespeare",
      "producer": "string",
      "createdTime": 1519862400000,
      "modifiedTime": 1519958703000,
      "lastIndexedTime": 1521062805000,
      "mimeType": "text/plain",
      "extension": "pdf",
      "pageCount": 2,
      "type": "Document",
      "language": "en",
      "truncatedContent": "ACT I\nSCENE I. Elsinore. A platform before the castle.\n FRANCISCO at his post. Enter to him BERNARDO\nBERNARDO\n Who's there?\n",
      "assetIds": [
        42,
        101
      ],
      "labels": [
        {
          "externalId": "my.known.id"
        }
      ],
      "sourceFile": {
        "name": "hamlet.txt",
        "directory": "plays/shakespeare",
        "source": "SubsurfaceConnectors",
        "mimeType": "application/octet-stream",
        "size": 1000,
        "hash": "23203f9264161714cdb8d2f474b9b641e6a735f8cea4098c40a3cab8743bd749",
        "assetIds": [ ],
        "labels": [
          {
            "externalId": "my.known.id"
          }
        ],
        "geoLocation": {
          "type": "Point",
          "coordinates": [
            10.74609,
            59.91273
          ]
        },
        "datasetId": 1,
        "securityCategories": [ ],
        "metadata": {
          "property1": "string",
          "property2": "string"
        }
      },
      "geoLocation": {
        "type": "Point",
        "coordinates": [
          10.74609,
          59.91273
        ]
      }
    }
  ],
  "nextCursor": "string"
}
Returns extracted textual information for the given document.
The documents pipeline extracts up to 1MiB of textual information from each processed document. The search and list endpoints truncate the textual content of each document, in order to reduce the size of the returned payload. If you want the whole text for a document, you can use this endpoint.
The accept request header MUST be set to text/plain. Other values will give an HTTP 406 error.
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [
      { }
    ],
    "duplicated": [
      { }
    ]
  }
}
This endpoint lets you search for documents by using advanced filters and free text queries. Free text queries are matched against the documents' filenames and contents.
The + symbol represents an AND operation, and the | symbol represents an OR. Searching for lorem + ipsum will match documents containing both "lorem" AND "ipsum" in the filename or content. Similarly, searching for lorem | ipsum will match documents containing either "lorem" OR "ipsum" in the filename or content.
The default operator between the search keywords is AND. That means that searching for two terms without any operator, for example, lorem ipsum, will match documents containing both the words "lorem" and "ipsum" in the filename or content.
You can use the operator - to exclude documents containing a specific word. For instance, the search lorem -ipsum will match documents that contain the word "lorem", but do NOT contain the word "ipsum".
Enclose multiple words inside double quotes " to group these words together. Normally, the search query lorem ipsum will match not only "lorem ipsum" but also "lorem cognite ipsum", and in general, there can be any number of words between the two words in the query. The search query "lorem ipsum", however, will match only exactly "lorem ipsum" and not "lorem cognite ipsum".
To search for the special characters (+, |, -, ", \), escape them with a preceding backslash \.
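The escaping rule can be applied programmatically before building a query. A small helper sketch; the function name is ours, not part of any Cognite SDK:

```javascript
// Escapes the characters that have special meaning in the document
// search query language: + | - " and the backslash itself.
const SPECIAL_QUERY_CHARS = new Set(['\\', '+', '|', '-', '"']);

function escapeSearchQuery(term) {
  let escaped = '';
  for (const ch of term) {
    escaped += SPECIAL_QUERY_CHARS.has(ch) ? '\\' + ch : ch;
  }
  return escaped;
}
```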
When you search for a term, the endpoint tries to return the most relevant documents first, with less relevant documents further down the list. There are a few factors that determine the relevance of a document:
The following request will return documents matching the specified search query.
{
"search": {
"query": "cognite \"lorem ipsum\""
}
}
The following example combines a query with a filter.
The search request will return documents matching the search query, where externalId
starts with "1".
The results will be ordered by how well they match the query string.
{
"search":{
"query":"cognite \"lorem ipsum\""
},
"filter":{
"prefix":{
"property":[
"externalId"
],
"value":"1"
}
}
}
When you enable highlights for your search query, the response contains an additional highlight field for each search hit, including the highlighted fragments for your query matches. However, the result limit is 20 documents due to the operation costs.
Filtering uses a special JSON filtering language.
It's quite flexible and consists of a number of different "leaf" filters, which can be combined arbitrarily using the boolean clauses and, or, and not.
Leaf filter | Supported fields | Description |
---|---|---|
equals | Non-array type fields | Only includes results that are equal to the specified value. |
in | Non-array type fields | Only includes results that are equal to one of the specified values. |
containsAll | Array type fields | Only includes results which contain all of the specified values. |
containsAny | Array type fields | Only includes results which contain at least one of the specified values. |
exists | All fields | Only includes results where the specified property exists (has a value). |
prefix | String type fields | Only includes results which start with the specified value. |
range | Non-array type fields | Only includes results that fall within the specified range. Supported operators: gt, gte, lt, lte. |
geojsonIntersects | geoLocation | Only includes results where the geoshape intersects with the specified geometry. |
geojsonDisjoint | geoLocation | Only includes results where the geoshape has nothing in common with the specified geometry. |
geojsonWithin | geoLocation | Only includes results where the geoshape falls within the specified geometry. |
inAssetSubtree | assetIds, assetExternalIds | Only includes results with a related asset in a subtree rooted at any of the specified IDs. |
search | name, content | |
The following overview shows the properties you can filter and which filter applies to which property.
Property | Type | Applicable filters |
---|---|---|
["id"] | integer | equals, in, range, exists |
["externalId"] | string | equals, in, prefix, exists |
["title"] | string | equals, in, prefix, exists |
["author"] | string | equals, in, prefix, exists |
["createdTime"] | integer | equals, in, range, exists |
["modifiedTime"] | integer | equals, in, range, exists |
["lastIndexedTime"] | integer | equals, in, range, exists |
["mimeType"] | string | equals, in, prefix, exists |
["extension"] | string | equals, in, prefix, exists |
["pageCount"] | integer | equals, in, range, exists |
["type"] | string | equals, in, prefix, exists |
["geoLocation"] | geometry object | geojsonIntersects, geojsonDisjoint, geojsonWithin, exists |
["language"] | string | equals, in, prefix, exists |
["assetIds"] | array of integers | containsAny, containsAll, exists, inAssetSubtree |
["assetExternalIds"] | array of strings | containsAny, containsAll, exists, inAssetSubtree |
["labels"] | array of Labels | containsAny, containsAll, exists |
["content"] | string | search |
["sourceFile", "name"] | string | equals, in, prefix, exists, search |
["sourceFile", "mimeType"] | string | equals, in, prefix, exists |
["sourceFile", "size"] | integer | equals, in, range, exists |
["sourceFile", "source"] | string | equals, in, prefix, exists |
["sourceFile", "directory"] | string | equals, in, prefix, exists |
["sourceFile", "assetIds"] | array of integers | containsAny, containsAll, exists, inAssetSubtree |
["sourceFile", "assetExternalIds"] | array of strings | containsAny, containsAll, exists, inAssetSubtree |
["sourceFile", "datasetId"] | integer | equals, in, range, exists |
["sourceFile", "securityCategories"] | array of integers | containsAny, containsAll, exists |
["sourceFile", "geoLocation"] | geometry object | geojsonIntersects, geojsonDisjoint, geojsonWithin, exists |
["sourceFile", "labels"] | array of Labels | containsAny, containsAll, exists |
["sourceFile", "metadata", <key>] | string | equals, in, prefix, exists |
["sourceFile", "metadata"] | string | equals, in, prefix, exists. This is a special filter field that targets all metadata values. An alternative to creating a filter for each key in the metadata field. |
{
"filter": {
"and": [
{
"or": [
{
"equals": {
"property": [
"type"
],
"value": "PDF"
}
},
{
"prefix": {
"property": [
"externalId"
],
"value": "hello"
}
}
]
},
{
"range": {
"property": [
"createdTime"
],
"lte": 1519862400000
}
},
{
"not": {
"in": {
"property": [
"sourceFile",
"name"
],
"values": [
"My Document.doc",
"My Other Document.docx"
]
}
}
}
]
}
}
By default, search results are ordered by relevance, meaning how well they match the given query string. However, it's possible to specify a different property to sort by. Sorting can be ascending or descending. The sort order is ascending if none is specified.
The following overview shows all properties that can be sorted on.
Property |
---|
["id"] |
["externalId"] |
["mimeType"] |
["extension"] |
["pageCount"] |
["author"] |
["title"] |
["language"] |
["type"] |
["createdTime"] |
["modifiedTime"] |
["lastIndexedTime"] |
["sourceFile", "name"] |
["sourceFile", "mimeType"] |
["sourceFile", "size"] |
["sourceFile", "source"] |
["sourceFile", "datasetId"] |
["sourceFile", "metadata", *] |
{
"sort":[
{
"property":[
"createdTime"
],
      "order":"asc"
}
]
}
project required | string Example: publicdata The project name. |
Fields to be set for the search request.
object | |
(bool filters (and (object) or or (object) or not (object))) or (leaf filters (equals (object) or in (object) or containsAny (object) or containsAll (object) or range (object) or prefix (object) or search (object) or exists (object) or geojsonIntersects (object) or geojsonDisjoint (object) or geojsonWithin (object) or inAssetSubtree (object))) (DocumentFilter) A JSON based filtering language. See detailed documentation above. | |
Array of objects (DocumentSearchCountAggregate) [ 1 .. 5 ] items Deprecated | |
Array of objects (DocumentSortItem) = 1 items List of properties to sort by. Currently only supports 1 property. | |
limit | integer <int32> [ 0 .. 1000 ] Default: 100 Maximum number of items. When using highlights the maximum value is reduced to 20. |
cursor | string Cursor for paging through results. |
highlight | boolean Whether or not matches in search results should be highlighted. |
{
  "search": {
    "query": "cognite \"lorem ipsum\"",
    "highlight": false
  },
  "filter": {
    "and": [
      {
        "prefix": {
          "property": [
            "name"
          ],
          "value": "Report"
        }
      },
      {
        "equals": {
          "property": [
            "type"
          ],
          "value": "PDF"
        }
      }
    ]
  },
  "aggregates": [
    {
      "name": "countOfTypes",
      "aggregate": "count",
      "groupBy": [
        {
          "property": [
            "type"
          ]
        }
      ]
    }
  ],
  "sort": [
    {
      "order": "asc",
      "property": [
        "sourceFile",
        "name"
      ]
    }
  ],
  "limit": 100,
  "cursor": "string",
  "highlight": true
}
{
  "items": [
    {
      "highlight": {
        "name": [
          "amet elit <em>non diam</em> aliquam suscipit"
        ],
        "content": [
          "Nunc <em>vulputate erat</em> ipsum, at aliquet ligula vestibulum at",
          "<em>Quisque</em> lectus ex, fringilla aliquet <em>eleifend</em> nec, laoreet a velit.\n\nPhasellus <em>faucibus</em> risus arcu"
        ]
      },
      "item": {
        "id": 2384,
        "externalId": "haml001",
        "title": "Hamlet",
        "author": "William Shakespeare",
        "producer": "string",
        "createdTime": 1519862400000,
        "modifiedTime": 1519958703000,
        "lastIndexedTime": 1521062805000,
        "mimeType": "text/plain",
        "extension": "pdf",
        "pageCount": 2,
        "type": "Document",
        "language": "en",
        "truncatedContent": "ACT I\nSCENE I. Elsinore. A platform before the castle.\n FRANCISCO at his post. Enter to him BERNARDO\nBERNARDO\n Who's there?\n",
        "assetIds": [
          42,
          101
        ],
        "labels": [
          {
            "externalId": "my.known.id"
          }
        ],
        "sourceFile": {
          "name": "hamlet.txt",
          "directory": "plays/shakespeare",
          "source": "SubsurfaceConnectors",
          "mimeType": "application/octet-stream",
          "size": 1000,
          "hash": "23203f9264161714cdb8d2f474b9b641e6a735f8cea4098c40a3cab8743bd749",
          "assetIds": [ ],
          "labels": [
            {
              "externalId": "my.known.id"
            }
          ],
          "geoLocation": {
            "type": "Point",
            "coordinates": [
              10.74609,
              59.91273
            ]
          },
          "datasetId": 1,
          "securityCategories": [ ],
          "metadata": {
            "property1": "string",
            "property2": "string"
          }
        },
        "geoLocation": {
          "type": "Point",
          "coordinates": [
            10.74609,
            59.91273
          ]
        }
      }
    }
  ],
  "aggregates": [
    {
      "name": "string",
      "groups": [
        {
          "group": [
            {
              "property": [
                "sourceFile",
                "name"
              ],
              "value": "string"
            }
          ],
          "count": 0
        }
      ],
      "total": 0
    }
  ],
  "nextCursor": "string"
}
The document preview service is a utility API that can render most document types as an image or PDF. This can be very helpful if you want to display a preview of a file in a frontend, or for other tasks that require one of these formats.
For both rendered formats there is a concept of a page. The actual meaning of a page depends on the source document. E.g. an image will always have exactly one page, while a spreadsheet will typically have one page representing each individual sheet.
The document preview service can only generate preview for document sizes that do not exceed 150 MiB. Trying to preview a larger document will give an error.
Previews can be created for the following types of files:
This endpoint returns a rendered image preview for a specific page of the specified document.
The accept request header MUST be set to image/png. Other values will give an HTTP 406 error.
The rendered image will be downsampled to a maximum of 2400x2400 pixels. Only PNG format is supported and only the first 10 pages can be rendered.
Previews will be rendered if necessary during the request. Be prepared for the request to take a few seconds to complete.
documentId required | integer Internal ID for document to preview |
pageNumber required | integer [ 1 .. 10 ] Page number to preview. Starting at 1 for first page |
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [
      { }
    ],
    "duplicated": [
      { }
    ]
  }
}
This endpoint returns a rendered PDF preview for a specified document.
The accept request header MUST be set to application/pdf. Other values will give an HTTP 406 error.
This endpoint is optimized for in-browser previews. We reserve the right to adjust the quality and other attributes of the output with this in mind. Please reach out to us if you have a different use case and requirements.
Only the first 100 pages will be included.
Previews will be rendered if necessary during the request. Be prepared for the request to take a few seconds to complete.
documentId required | integer Internal ID for document to preview |
cdf-version | string Example: alpha cdf version header. Use this to specify the requested CDF release. |
{
  "error": {
    "code": 401,
    "message": "Could not authenticate.",
    "missing": [
      { }
    ],
    "duplicated": [
      { }
    ]
  }
}
This endpoint works similarly to the normal preview endpoint, except that it returns a short-lived temporary link to download the rendered preview instead of returning the binary data.
documentId required | integer Internal ID for document to preview |
cdf-version | string Example: alpha cdf version header. Use this to specify the requested CDF release. |
{
  "temporaryLink": "string",
  "expirationTime": 1519862400000
}
Manage data in the raw NoSQL database. Each project will have a variable number of raw databases, each of which will have a variable number of tables, each of which will have a variable number of key-value objects. Only queries on key are supported through this API.
Create databases in a project. It is possible to post a maximum of 1000 databases per request.
List of names of databases to be created.
Array of objects (RawDB) |
{
  "items": [
    {
      "name": "string"
    }
  ]
}
{
  "items": [
    {
      "name": "string"
    }
  ]
}
Create tables in a database. It is possible to post a maximum of 1000 tables per request.
dbName required | string Name of the database to create tables in. |
ensureParent | boolean Default: false Create database if it doesn't exist already |
List of tables to create.
Array of objects (RawDBTable) |
{
  "items": [
    {
      "name": "string"
    }
  ]
}
{
  "items": [
    {
      "name": "string"
    }
  ]
}
Deletes a database, but fails if the database is not empty and recursive is set to false (the default).
List of names of the databases to be deleted.
Array of objects (RawDB) | |
recursive | boolean Default: false When true, tables of this database are deleted with the database. |
{
  "items": [
    {
      "name": "string"
    }
  ],
  "recursive": false
}
{ }
dbName required | string [ 1 .. 32 ] characters Name of the database containing the rows. |
tableName required | string [ 1 .. 64 ] characters Name of the table containing the rows. |
Keys to the rows to delete.
Array of objects (RawDBRowKey) |
{
  "items": [
    {
      "key": "string"
    }
  ]
}
{ }
dbName required | string Name of the database to delete tables in. |
List of tables to delete.
Array of objects (RawDBTable) |
{
  "items": [
    {
      "name": "string"
    }
  ]
}
{ }
Insert rows into a table. It is possible to post a maximum of 10000 rows per request. If the rowKey already exists, the columns of the existing row are replaced.
The rowKey is limited to 1024 characters, including Unicode characters. The maximum size of the columns object is 5 MiB, and the maximum size of each column name and each column value is 2621440 characters. If you want to store a huge amount of data per row or column, we recommend using the Files API to upload blobs and then referencing them from the Raw row.
The columns object is a key-value object, where the key corresponds to the column name and the value is the column value. It supports all the valid types of values in JSON: number, string, array, and even nested JSON structures (see the payload example to the right).
Note: There is no rollback if an error occurs, which means partial data may be written. However, it's safe to retry the request, since this endpoint supports both update and insert (upsert).
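Because at most 10000 rows can be posted per request, and retrying is safe, a client typically splits a large dataset into batches and sends each batch as its own insert request. A sketch of the batching step; the helper name is ours, not part of any SDK:

```javascript
// Splits a list of Raw rows into batches no larger than the documented
// per-request limit, so each batch can be sent as one insert request
// (and retried independently, since inserts are upserts).
const MAX_ROWS_PER_REQUEST = 10000;

function batchRows(rows, size = MAX_ROWS_PER_REQUEST) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```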
dbName required | string [ 1 .. 32 ] characters Name of the database. |
tableName required | string [ 1 .. 64 ] characters Name of the table. |
ensureParent | boolean Default: false Create database/table if it doesn't exist already |
List of rows to create.
Array of objects (RawDBRowInsert) |
{
  "items": [
    {
      "key": "some rowKey",
      "columns": {
        "some int-col": 10,
        "some string-col": "string example",
        "some json-col": {
          "test": {
            "foo": "nested"
          }
        },
        "some array-col": [
          0,
          1,
          3,
          4
        ]
      }
    }
  ]
}
{ }
limit | integer <int32> [ 1 .. 1000 ] Default: 25 Limit on the number of databases to be returned. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
const databases = await client.raw.listDatabases();
{
  "items": [
    {
      "name": "string"
    }
  ],
  "nextCursor": "string"
}
dbName required | string The name of a database to retrieve tables from. |
limit | integer <int32> [ 1 .. 1000 ] Default: 25 Limit on the number of tables to be returned. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
const tables = await client.raw.listTables('My company');
{
  "items": [
    {
      "name": "string"
    }
  ],
  "nextCursor": "string"
}
Retrieve cursors based on the last updated time range. Normally this endpoint is used for reading in parallel.
Each cursor should be supplied as the 'cursor' query parameter on GET requests to Read Rows. Note that the 'minLastUpdatedTime' and the 'maxLastUpdatedTime' query parameter on Read Rows are ignored when a cursor is specified.
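Each returned cursor can then be drained independently by following nextCursor until the API stops returning one, and the per-cursor reads can run in parallel. A sketch of the drain loop; `drainCursor` and `fetchPage` are our own names, with `fetchPage` standing in for a GET to the Read Rows endpoint:

```javascript
// Reads every row reachable from one starting cursor. fetchPage(cursor)
// must resolve to { items: [...], nextCursor?: string }, mirroring the
// Read Rows response shape.
async function drainCursor(fetchPage, startCursor) {
  const rows = [];
  let cursor = startCursor;
  do {
    const page = await fetchPage(cursor);
    rows.push(...page.items);
    cursor = page.nextCursor;
  } while (cursor !== undefined);
  return rows;
}
```

For parallel retrieval, run one `drainCursor` per cursor returned by this endpoint, e.g. with `Promise.all`.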
dbName required | string Name of the database. |
tableName required | string Name of the table. |
minLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 An exclusive filter, specified as the number of milliseconds that have elapsed since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 An inclusive filter, specified as the number of milliseconds that have elapsed since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
numberOfCursors | integer <int32> [ 1 .. 10000 ] The number of cursors to return. Defaults to 10. |
{
  "items": [
    "string"
  ]
}
dbName required | string [ 1 .. 32 ] characters Name of the database to retrieve the row from. |
tableName required | string [ 1 .. 64 ] characters Name of the table to retrieve the row from. |
rowKey required | string Row key of the row to retrieve. |
await client.raw.retrieveRow('My company', 'Customers', 'customer1');
{
  "key": "string",
  "columns": { },
  "lastUpdatedTime": 0
}
dbName required | string Name of the database. |
tableName required | string Name of the table. |
limit | integer <int32> [ 1 .. 10000 ] Default: 25 Limit the number of results. The API may return fewer than the specified limit. |
columns | string Example: columns=column1,column2 Ordered list of column keys, separated by commas. Leave empty for all, use single comma to retrieve only row keys. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
minLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 An exclusive filter, specified as the number of milliseconds that have elapsed since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
maxLastUpdatedTime | integer <int64> (EpochTimestamp) >= 0 An inclusive filter, specified as the number of milliseconds that have elapsed since 00:00:00 Thursday, 1 January 1970, Coordinated Universal Time (UTC), minus leap seconds. |
await client.raw.listRows('My company', 'Employees', { columns: ['last_name'] });
{
  "items": [
    {
      "key": "string",
      "columns": {},
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
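The pagination rule from the introduction (keep every parameter except `cursor` constant between pages) can be sketched as a loop over `nextCursor`. `fetch` is a hypothetical stand-in for the HTTP GET to this endpoint.

```python
def list_all_rows(fetch, base_params):
    """Page through Read Rows by following nextCursor.

    fetch(params) is a hypothetical callable returning the parsed JSON
    response. All parameters except 'cursor' stay identical between pages,
    as the pagination rules require.
    """
    items, cursor = [], None
    while True:
        params = dict(base_params)  # copy so base_params stays untouched
        if cursor is not None:
            params["cursor"] = cursor
        resp = fetch(params)
        items.extend(resp["items"])
        cursor = resp.get("nextCursor")
        if cursor is None:  # last page reached
            return items
```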
Extraction Pipeline objects represent the applications and software that are deployed to ingest operational data into CDF. An extraction pipeline can consist of a number of different software components between the source system and CDF. The extraction pipeline object represents the software component that actually sends the data to CDF. Two examples are Cognite extractors and third-party ETL tools such as Microsoft Azure or Informatica PowerCenter.
Creates multiple new extraction pipelines. A maximum of 1000 extraction pipelines can be created per request.
required | Array of objects (CreateExtPipe) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string"
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string",
      "id": 9007199254740991,
      "lastSuccess": 0,
      "lastFailure": 0,
      "lastMessage": "string",
      "lastSeen": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
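Since a create request carries at most 1000 extraction pipelines, a larger batch has to be split across requests. A minimal sketch of that chunking, using the request-body shape shown above (the 1000-item limit is taken from this documentation; the function name is illustrative):

```python
def chunk_create_requests(pipelines, max_per_request=1000):
    """Split a list of CreateExtPipe-shaped dicts into request bodies that
    each respect the 1000-items-per-request limit."""
    return [
        {"items": pipelines[i:i + max_per_request]}
        for i in range(0, len(pipelines), max_per_request)
    ]
```

Each returned dict is one POST body; send them sequentially or in parallel, subject to the concurrency quota described in the request-throttling section.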
Delete extraction pipelines for a given list of IDs and external IDs. When an extraction pipeline is deleted, all extraction pipeline runs related to it are automatically deleted.
required | Array of ExtPipeInternalId (object) or ExtPipeExternalId (object) (ExtPipeId) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 9007199254740991 }
  ],
  "ignoreUnknownIds": false
}
{ }
Use advanced filtering options to find extraction pipelines.
object (ExtPipesFilter) | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
{
  "filter": {
    "externalIdPrefix": "string",
    "name": "string",
    "description": "string",
    "dataSetIds": [
      { "id": 1 }
    ],
    "schedule": "string",
    "contacts": [
      {
        "name": "string",
        "email": "user@example.com",
        "role": "string",
        "sendNotification": true
      }
    ],
    "rawTables": [
      { "dbName": "string", "tableName": "string" }
    ],
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "source": "string",
    "documentation": "string",
    "createdBy": "string",
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 }
  },
  "limit": 100,
  "cursor": "string"
}
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string",
      "id": 9007199254740991,
      "lastSuccess": 0,
      "lastFailure": 0,
      "lastMessage": "string",
      "lastSeen": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Returns a list of all extraction pipelines for a given project.
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
ep_list = client.extraction_pipelines.list(limit=5)
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string",
      "id": 9007199254740991,
      "lastSuccess": 0,
      "lastFailure": 0,
      "lastMessage": "string",
      "lastSeen": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieve an extraction pipeline by its ID. If you want to retrieve extraction pipelines by externalIds, use Retrieve extraction pipelines instead.
id required | integer <int64> (CogniteInternalId) [ 1 .. 9007199254740991 ] A server-generated ID for the object. |
res = client.extraction_pipelines.retrieve(id=1)
res = client.extraction_pipelines.retrieve(external_id="1")
{
  "externalId": "string",
  "name": "string",
  "description": "string",
  "dataSetId": 9007199254740991,
  "rawTables": [
    { "dbName": "string", "tableName": "string" }
  ],
  "schedule": "string",
  "contacts": [
    {
      "name": "string",
      "email": "user@example.com",
      "role": "string",
      "sendNotification": true
    }
  ],
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "source": "string",
  "documentation": "string",
  "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
  "createdBy": "string",
  "id": 9007199254740991,
  "lastSuccess": 0,
  "lastFailure": 0,
  "lastMessage": "string",
  "lastSeen": 0,
  "createdTime": 0,
  "lastUpdatedTime": 0
}
Retrieves information about multiple extraction pipelines in the same project. All ids and externalIds must be unique.
required | Array of ExtPipeInternalId (object) or ExtPipeExternalId (object) (ExtPipeId) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 9007199254740991 }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string",
      "id": 9007199254740991,
      "lastSuccess": 0,
      "lastFailure": 0,
      "lastMessage": "string",
      "lastSeen": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Update information for a list of extraction pipelines. Fields that are not included in the request are not changed.
required | Array of ExtPipeUpdateById (object) or ExtPipeUpdateByExternalId (object) (ExtPipeUpdate) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "id": 9007199254740991,
      "update": {
        "externalId": { "set": "string" },
        "name": { "set": "string" },
        "description": { "set": "string" },
        "dataSetId": { "set": 9007199254740991 },
        "schedule": { "set": "string" },
        "rawTables": {
          "set": [
            { "dbName": "string", "tableName": "string" }
          ]
        },
        "contacts": {
          "set": [
            {
              "name": "string",
              "email": "user@example.com",
              "role": "string",
              "sendNotification": true
            }
          ]
        },
        "metadata": {
          "set": {
            "property1": "string",
            "property2": "string"
          }
        },
        "source": { "set": "string" },
        "documentation": { "set": "string" },
        "notificationConfig": {
          "set": { "allowedNotSeenRangeInMinutes": 0 }
        }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "name": "string",
      "description": "string",
      "dataSetId": 9007199254740991,
      "rawTables": [
        { "dbName": "string", "tableName": "string" }
      ],
      "schedule": "string",
      "contacts": [
        {
          "name": "string",
          "email": "user@example.com",
          "role": "string",
          "sendNotification": true
        }
      ],
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "source": "string",
      "documentation": "string",
      "notificationConfig": { "allowedNotSeenRangeInMinutes": 0 },
      "createdBy": "string",
      "id": 9007199254740991,
      "lastSuccess": 0,
      "lastFailure": 0,
      "lastMessage": "string",
      "lastSeen": 0,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Extraction pipeline runs are CDF objects that store statuses related to an extraction pipeline. The supported statuses are success, failure, and seen, and they cover two different types of extraction pipeline operation. Success and failure indicate the outcome of a particular run in which the pipeline attempts to send data to CDF: if the data is successfully posted to CDF, the status of the run is 'success'; if the run is unsuccessful and the data is not posted, the status is 'failure'. A message can be stored to explain the run status. Seen is a heartbeat status that indicates the extraction pipeline is alive. It is sent periodically on a schedule and shows that the pipeline is working even when no data has been sent to CDF.
Create multiple extraction pipeline runs. The current version supports one extraction pipeline run per request. Extraction pipeline runs support three statuses: success, failure, and seen. The content of the error message parameter is configurable and will contain any messages that have been configured within the extraction pipeline.
required | Array of objects (ExtPipeRunRequest) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "string",
      "status": "success",
      "message": "string",
      "createdTime": 0
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "status": "string",
      "message": "string",
      "createdTime": 0,
      "externalId": "string"
    }
  ]
}
Use advanced filtering options to find extraction pipeline runs. Results are sorted by createdTime in descending order.
required | object (RunsFilter) |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
{
  "filter": {
    "externalId": "string",
    "statuses": ["success"],
    "createdTime": { "max": 0, "min": 0 },
    "message": { "substring": "string" }
  },
  "limit": 100,
  "cursor": "string"
}
{
  "items": [
    {
      "id": 1,
      "status": "string",
      "message": "string",
      "createdTime": 0
    }
  ],
  "nextCursor": "string"
}
Lists all extraction pipeline runs for a given extraction pipeline, sorted by createdTime in descending order.
externalId required | string |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
{
  "items": [
    {
      "id": 1,
      "status": "string",
      "message": "string",
      "createdTime": 0
    }
  ],
  "nextCursor": "string"
}
Extraction Pipelines Configs are configuration file revisions tied to an extraction pipeline. Users can create new configuration revisions, and extractors can fetch the latest, making it easy to deploy configuration files from source control, automated scripts, etc.
Creates a configuration revision for the given extraction pipeline.
externalId required | string [ 1 .. 255 ] characters External ID of the extraction pipeline this configuration revision belongs to. |
config | string Configuration content. |
description | string or null A description of this configuration revision. |
{
  "externalId": "string",
  "config": "string",
  "description": "string"
}
{
  "externalId": "string",
  "config": "string",
  "revision": 2147483647,
  "createdTime": 0,
  "description": "string"
}
Retrieves a single configuration revision. By default, the latest revision is retrieved.
externalId required | string |
revision | integer <int32> >= 0 Default: 0 |
activeAtTime | integer <int64> >= 0 Default: 0 |
{
  "externalId": "string",
  "config": "string",
  "revision": 2147483647,
  "createdTime": 0,
  "description": "string"
}
Lists configuration revisions for the given extraction pipeline.
externalId required | string |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
{
  "items": [
    {
      "externalId": "string",
      "revision": 2147483647,
      "createdTime": 0,
      "description": "string"
    }
  ]
}
Reverts the latest configuration revision to an older revision. Equivalent to creating a new revision identical to the old revision.
externalId required | string [ 1 .. 255 ] characters External ID of the extraction pipeline to revert configurations for. |
revision | integer <int32> [ 0 .. 2147483647 ] Revision number of this configuration. |
{
  "externalId": "string",
  "revision": 2147483647
}
{
  "externalId": "string",
  "config": "string",
  "revision": 2147483647,
  "createdTime": 0,
  "description": "string"
}
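The revert semantics described above (reverting creates a new revision identical to the old one, rather than rewinding history) can be modeled with a small in-memory store. This is an illustration of the behavior, not the service implementation:

```python
class ConfigRevisions:
    """In-memory model of configuration revision semantics."""

    def __init__(self):
        self._revs = []  # index i holds revision i + 1

    def create(self, config, description=None):
        rev = {
            "revision": len(self._revs) + 1,
            "config": config,
            "description": description,
        }
        self._revs.append(rev)
        return rev

    def latest(self):
        return self._revs[-1]

    def revert(self, revision):
        # Reverting copies the old revision's content into a brand-new
        # revision; the revision counter keeps increasing.
        old = self._revs[revision - 1]
        return self.create(old["config"], old["description"])
```

Note that after a revert, fetching the latest revision returns the old content under a new, higher revision number.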
Data sets let you document and track data lineage, ensure data integrity, and allow 3rd parties to write their insights securely back to a Cognite Data Fusion (CDF) project.
Data sets group and track data by its source. For example, a data set can contain all work orders originating from SAP. Typically, an organization will have one data set for each of its data ingestion pipelines in CDF.
A data set consists of metadata about the data set, and the data objects that belong to the data set. Data objects, for example events, files, and time series, are added to a data set through the dataSetId
field of the data object. Each data object can belong to only one data set.
To learn more about data sets, see the getting started guide.
Aggregate data sets in the same project. Criteria can be applied to select a subset of data sets.
object (DataSetFilter) Filter on data sets with strict matching. |
{
  "filter": {
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix",
    "writeProtected": true
  }
}
{
  "items": [
    { "count": 0 }
  ]
}
You can create a maximum of 10 data sets per request.
List of the data sets to create.
required | Array of objects (DataSetSpec) [ 1 .. 10 ] items |
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "writeProtected": false
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "writeProtected": false,
      "id": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Use advanced filtering options to find data sets.
object (DataSetFilter) Filter on data sets with strict matching. | |
limit | integer <int32> [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
{
  "filter": {
    "metadata": {
      "property1": "string",
      "property2": "string"
    },
    "createdTime": { "max": 0, "min": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "externalIdPrefix": "my.known.prefix",
    "writeProtected": true
  },
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo"
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "writeProtected": false,
      "id": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieve data sets by IDs or external IDs.
List of the IDs of the data sets to retrieve. You can retrieve a maximum of 1000 data sets per request. All IDs must be unique.
required | Array of DataSetInternalId (object) or DataSetExternalId (object) (DataSetIdEither) [ 1 .. 1000 ] items unique |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    { "id": 1 }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "writeProtected": false,
      "id": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
All provided IDs and external IDs must be unique. Fields that are not included in the request are not changed.
required | Array of DataSetChangeById (object) or DataSetChangeByExternalId (object) (DataSetUpdate) |
{
  "items": [
    {
      "update": {
        "externalId": { "set": "string" },
        "name": { "set": "string" },
        "description": { "set": "string" },
        "metadata": {
          "set": {
            "key1": "value1",
            "key2": "value2"
          }
        },
        "writeProtected": { "set": true }
      },
      "id": 1
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "writeProtected": false,
      "id": 1,
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Creates label definitions that can be used across different resource types. The label definitions are uniquely identified by their external id.
List of label definitions to create
required | Array of objects (ExternalLabelDefinition) [ 1 .. 1000 ] items unique |
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1
    }
  ]
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "createdTime": 0
    }
  ]
}
Delete all the label definitions specified by their external IDs. The resource items that have the corresponding label attached remain unmodified. It is up to the client to clean up the attached labels on those resource items if necessary.
List of external ids of label definitions to delete.
required | Array of objects (LabelDefinitionExternalId) [ 1 .. 1000 ] items unique |
{
  "items": [
    { "externalId": "my.known.id" }
  ]
}
{ }
Use advanced filtering options to find label definitions.
object (Filter) Filter on labels definitions with strict matching. | |
cursor | string |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
{
  "filter": {
    "name": "string",
    "externalIdPrefix": "string",
    "dataSetIds": [
      { "id": 1 }
    ]
  },
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "limit": 100
}
{
  "items": [
    {
      "externalId": "my.known.id",
      "name": "string",
      "description": "string",
      "dataSetId": 1,
      "createdTime": 0
    }
  ],
  "nextCursor": "string"
}
The relationships resource type represents connections between resource objects in CDF. Relationships allow you to organize assets in other structures in addition to the standard hierarchical asset structure. Each relationship is between a source and a target object and is defined by a relationship type and the external IDs and resource types of the source and target objects. Optionally, a relationship can be time-constrained with a start and end time. To define and manage the available relationship types, use the labels resource type. The externalId field uniquely identifies each relationship.
List of the relationships to create. You can create a maximum of 1000 relationships per request. Relationships should be unique, but CDF does not prevent you from creating duplicates where only the externalId differs.
Relationships are uniquely identified by their externalId; a relationship whose externalId already exists will not be created.
The order of relationships in the response equals the order in the request.
Data required to create relationships. You can request a maximum of 1000 relationships per request.
required | Array of objects (relationship) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ]
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Delete the relationships between resources identified by the external IDs in the request. You can delete a maximum of 1000 relationships per request.
Data required to delete relationships. You can delete a maximum of 1000 relationships per request.
required | Array of objects (itemsArray) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean (ignoreUnknownIds) Default: false Ignore external IDs that are not found. |
{
  "items": [
    { "externalId": "string" }
  ],
  "ignoreUnknownIds": false
}
{ }
Lists relationships matching the query filter in the request. You can retrieve a maximum of 1000 relationships per request.
Data required to filter relationships. Combined filters are interpreted as an AND operation (not OR). Only relationships that match ALL the provided filters are returned.
object (advancedListFilter) Filter on relationships with exact match. Multiple filter elements in one property, for example | |
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to return. |
cursor | string |
fetchResources | boolean (fetchResources) Default: false If true, the API will try to fetch the resources referred to in the relationship, based on the user's access rights. It will silently fail to attach the resources if the user lacks access to some of them. |
partition | string (Partition) Splits the data set into N partitions. The parameter has the format m/n, where n is the total number of partitions and m is the 1-based index of this partition; run the query for each m from 1 to n in parallel, passing the partition parameter to all subqueries. To prevent unexpected problems and maximize read throughput, use at most 10 partitions (N <= 10). When you specify more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10.
In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, such requests will result in a 400 Bad Request status. |
{
  "filter": {
    "sourceExternalIds": ["string"],
    "sourceTypes": ["asset"],
    "targetExternalIds": ["string"],
    "targetTypes": ["asset"],
    "dataSetIds": [
      { "id": 1 }
    ],
    "startTime": { "max": 0, "min": 0 },
    "endTime": { "max": 0, "min": 0 },
    "confidence": { "min": 0, "max": 0 },
    "lastUpdatedTime": { "max": 0, "min": 0 },
    "createdTime": { "max": 0, "min": 0 },
    "activeAtTime": { "max": 0, "min": 0 },
    "labels": {
      "containsAny": [
        { "externalId": "my.known.id" }
      ]
    },
    "sourcesOrTargets": [
      { "type": "asset", "externalId": "string" }
    ]
  },
  "limit": 100,
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "fetchResources": false,
  "partition": "1/10"
}
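The AND semantics of combined filters (a relationship must match every filter element you supply) can be illustrated with a small in-memory predicate. Only a few filter fields are modeled, and the function is purely illustrative:

```python
def matches_filter(rel, flt):
    """Evaluate a simplified relationships filter against one relationship.
    All supplied filter elements are ANDed together, as the endpoint
    documentation describes; absent elements impose no constraint."""
    checks = []
    if "sourceTypes" in flt:
        checks.append(rel["sourceType"] in flt["sourceTypes"])
    if "dataSetIds" in flt:
        checks.append(rel.get("dataSetId") in {d["id"] for d in flt["dataSetIds"]})
    if "confidence" in flt:
        rng = flt["confidence"]
        checks.append(rng.get("min", 0) <= rel.get("confidence", 0) <= rng.get("max", 1))
    return all(checks)
```

A relationship that satisfies the confidence range but has the wrong sourceType is excluded, because every element must hold.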
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "source": {
        "createdTime": 0,
        "lastUpdatedTime": 0,
        "rootId": 1,
        "aggregates": {
          "childCount": 0,
          "depth": 0,
          "path": [
            { "id": 1 }
          ]
        },
        "parentId": 1,
        "parentExternalId": "my.known.id",
        "externalId": "my.known.id",
        "name": "string",
        "description": "string",
        "dataSetId": 1,
        "metadata": {
          "property1": "string",
          "property2": "string"
        },
        "source": "string",
        "labels": [
          { "externalId": "my.known.id" }
        ],
        "geoLocation": {
          "type": "Feature",
          "geometry": {
            "type": "Point",
            "coordinates": [0, 0]
          },
          "properties": {}
        },
        "id": 1
      },
      "target": {
        "createdTime": 0,
        "lastUpdatedTime": 0,
        "rootId": 1,
        "aggregates": {
          "childCount": 0,
          "depth": 0,
          "path": [
            { "id": 1 }
          ]
        },
        "parentId": 1,
        "parentExternalId": "my.known.id",
        "externalId": "my.known.id",
        "name": "string",
        "description": "string",
        "dataSetId": 1,
        "metadata": {
          "property1": "string",
          "property2": "string"
        },
        "source": "string",
        "labels": [
          { "externalId": "my.known.id" }
        ],
        "geoLocation": {
          "type": "Feature",
          "geometry": {
            "type": "Point",
            "coordinates": [0, 0]
          },
          "properties": {}
        },
        "id": 1
      }
    }
  ],
  "nextCursor": "string"
}
Lists all relationships. The order of retrieved objects may change for two calls with the same parameters. The endpoint supports pagination. The initial call to this endpoint should not contain a cursor, but the cursor parameter should be used to retrieve further pages of results.
limit | integer [ 1 .. 1000 ] Default: 100 Limits the number of results to be returned. The maximum results returned by the server is 1000 even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
partition | string Example: partition=1/10 Splits the data set into N partitions. The parameter has the format m/n, where n is the total number of partitions and m is the 1-based index of this partition; run the query for each m from 1 to n in parallel, passing the partition parameter to all subqueries. To prevent unexpected problems and maximize read throughput, use at most 10 partitions (N <= 10). When you specify more than 10 partitions, CDF may silently reduce the number of partitions, for example to 10.
In future releases of the resource APIs, Cognite may reject requests that specify more than 10 partitions. When Cognite enforces this behavior, such requests will result in a 400 Bad Request status. |
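Generating the partition values for parallel retrieval is a one-liner; the sketch below also enforces the at-most-10 guidance (the cap value comes from this documentation):

```python
def partition_params(n):
    """Produce one 'm/n' partition value per parallel request, with m
    running from 1 to n. Capped at 10 per the partitioning guidance above."""
    if not 1 <= n <= 10:
        raise ValueError("use between 1 and 10 partitions")
    return [f"{m}/{n}" for m in range(1, n + 1)]
```

Issue one list request per returned value (each with its own cursor chain), then merge the results.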
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieve relationships by external IDs. You can retrieve a maximum of 1000 relationships per request. The order of the relationships in the response equals the order in the request.
Data required to list relationships.
required | Array of objects (itemsArray) [ 1 .. 1000 ] items |
ignoreUnknownIds | boolean (ignoreUnknownIds) Default: false Ignore external IDs that are not found. |
fetchResources | boolean (fetchResources) Default: false If true, the API will try to fetch the resources referred to in the relationship, based on the user's access rights. It will silently fail to attach the resources if the user lacks access to some of them. |
{
  "items": [
    { "externalId": "string" }
  ],
  "ignoreUnknownIds": false,
  "fetchResources": false
}
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "source": {
        "createdTime": 0,
        "lastUpdatedTime": 0,
        "rootId": 1,
        "aggregates": {
          "childCount": 0,
          "depth": 0,
          "path": [
            { "id": 1 }
          ]
        },
        "parentId": 1,
        "parentExternalId": "my.known.id",
        "externalId": "my.known.id",
        "name": "string",
        "description": "string",
        "dataSetId": 1,
        "metadata": {
          "property1": "string",
          "property2": "string"
        },
        "source": "string",
        "labels": [
          { "externalId": "my.known.id" }
        ],
        "geoLocation": {
          "type": "Feature",
          "geometry": {
            "type": "Point",
            "coordinates": [0, 0]
          },
          "properties": {}
        },
        "id": 1
      },
      "target": {
        "createdTime": 0,
        "lastUpdatedTime": 0,
        "rootId": 1,
        "aggregates": {
          "childCount": 0,
          "depth": 0,
          "path": [
            { "id": 1 }
          ]
        },
        "parentId": 1,
        "parentExternalId": "my.known.id",
        "externalId": "my.known.id",
        "name": "string",
        "description": "string",
        "dataSetId": 1,
        "metadata": {
          "property1": "string",
          "property2": "string"
        },
        "source": "string",
        "labels": [
          { "externalId": "my.known.id" }
        ],
        "geoLocation": {
          "type": "Feature",
          "geometry": {
            "type": "Point",
            "coordinates": [0, 0]
          },
          "properties": {}
        },
        "id": 1
      }
    }
  ]
}
Update relationships between resources according to the partial definitions given in the request payload. Fields not mentioned in the payload remain unchanged. You can update up to 1000 relationships in one operation.
To remove a value from an optional field, set the setNull field to true.
The order of the updated relationships in the response equals the order in the request.
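The partial-update semantics can be sketched with a small simulator: `set` overwrites a field, `setNull: true` clears an optional field, and fields absent from the update are left untouched. This is an in-memory illustration of the behavior, not the service implementation:

```python
def apply_relationship_update(resource, update):
    """Apply an update object of the form {field: {"set": v} | {"setNull": True}}
    to a relationship dict, mimicking the endpoint's partial-update rules."""
    out = dict(resource)  # copy; fields not mentioned stay unchanged
    for field, change in update.items():
        if change.get("setNull"):
            out.pop(field, None)   # clear the optional field
        elif "set" in change:
            out[field] = change["set"]
    return out
```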
Data required to update relationships.
required | Array of objects (relationshipUpdate) [ 1 .. 1000 ] items |
{
  "items": [
    {
      "externalId": "string",
      "update": {
        "sourceType": { "set": "asset" },
        "sourceExternalId": { "set": "string" },
        "targetType": { "set": "asset" },
        "targetExternalId": { "set": "string" },
        "confidence": { "set": 1 },
        "startTime": { "set": 0 },
        "endTime": { "set": 0 },
        "dataSetId": { "set": 1 },
        "labels": {
          "add": [
            { "externalId": "my.known.id" }
          ],
          "remove": [
            { "externalId": "my.known.id" }
          ]
        }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "sourceExternalId": "string",
      "sourceType": "asset",
      "targetExternalId": "string",
      "targetType": "asset",
      "startTime": 0,
      "endTime": 0,
      "confidence": 1,
      "dataSetId": 1,
      "labels": [
        { "externalId": "my.known.id" }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0
    }
  ]
}
Annotations reflect contextual information on base CDF resource types, such as files and time series, that is not present on the object itself. The annotations concept provides several benefits.
Creates the given annotations.
An annotation must reference an annotated resource.
The reference can be made by providing the internal ID of the annotated resource.
The annotation must have the status field set to either "suggested", "rejected", or "approved".
The caller must have read-access on all the annotated resources, otherwise the call will fail.
The annotation type property must be set to one of the globally available annotation types.
See the documentation of the annotationType and data attributes for details.
The annotation data must conform to the schema provided by the annotation type.
The creating application and its version must be provided. The creating user must also be provided, but if the annotation is being created by a service, this can be set to null.
A request for creating annotations
required | Array of objects (AnnotationsV2CreateSchema) [ 1 .. 1000 ] A list of annotations to create |
{
  "items": [
    {
      "annotatedResourceType": "file",
      "annotatedResourceId": 1337,
      "annotationType": "images.ObjectDetection",
      "creatingApp": "cognite-vision",
      "creatingAppVersion": "1.2.1",
      "creatingUser": "john.doe@cognite.com",
      "data": {
        "assetRef": { "externalId": "abc" },
        "symbolRegion": {
          "xMin": 0.1,
          "xMax": 0.2,
          "yMin": 0.1,
          "yMax": 0.2
        },
        "textRegion": {
          "xMin": 0.2,
          "xMax": 0.3,
          "yMin": 0.2,
          "yMax": 0.3
        },
        "pageNumber": 43
      },
      "status": "approved"
    }
  ]
}
{- "items": [
- {
- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
]
}
Deletes the referenced annotations completely.
The caller must have read-access on all the annotated resources, otherwise the call will fail.
A request referencing existing annotations
required | Array of objects (AnnotationsV2ReferenceSchema) [ 1 .. 1000 ] A list of existing annotation references |
{- "items": [
- {
- "id": 4096
}
]
}
{- "error": {
- "code": 401,
- "message": "Could not authenticate.",
- "missing": [
- { }
], - "duplicated": [
- { }
]
}
}
Lists the annotations the caller has access to, based on a filter.
The caller must have read-access on the annotated resources listed in the filter, otherwise the call will fail.
A request specifying the annotation listing behavior
cursor | string or null [ 1 .. 255 ] characters A cursor pointing to another page of results |
limit | integer [ 1 .. 1000 ] Default: 25 |
required | object (AnnotationsV2FilterSchema) A filter to apply on annotations |
{- "cursor": "MzE1NjAwMDcxNzQ0ODI5",
- "limit": 25,
- "filter": {
- "annotatedResourceType": "file",
- "annotatedResourceIds": [
- {
- "id": 1066
}, - {
- "id": 1067
}
], - "status": "approved",
- "data": {
- "label": "cat"
}
}
}
{- "items": [
- {
- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
], - "nextCursor": null
}
Retrieves the referenced annotation.
The caller must have read-access on the annotated resource, otherwise the call will fail.
annotationId required | integer <int64> (AnnotationId) [ 1 .. 9007199254740991 ] Example: 4096 The internal ID of the annotation |
{- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
Retrieves the referenced annotations.
The caller must have read-access on all the annotated resources, otherwise the call will fail.
A request referencing existing annotations
required | Array of objects (AnnotationsV2ReferenceSchema) [ 1 .. 1000 ] A list of existing annotation references |
{- "items": [
- {
- "id": 4096
}
]
}
{- "items": [
- {
- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
]
}
Suggests the given annotations, i.e., creates them with status set to "suggested".
An annotation must reference an annotated resource. The reference can be made by providing the internal ID of the annotated resource.
The caller must have read-access on all the annotated resources, otherwise the call will fail.
The annotation type property must be set to one of the globally available annotation types. See the documentation of the annotationType and data attributes for details.
The annotation data must conform to the schema provided by the annotation type.
The creating application and its version must be provided. The creating user must also be provided, but if the annotation is being created by a service, it can be set to null.
A request for suggesting annotations, i.e., creating them with the "suggested" status
required | Array of objects (AnnotationsV2SuggestSchema) [ 1 .. 1000 ] A list of annotations to suggest |
{- "items": [
- {
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}
}
]
}
{- "items": [
- {
- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
]
}
Updates the referenced annotations.
The caller must have read-access on all the annotated resources, otherwise the call will fail.
A request for updating existing annotations
required | Array of objects (AnnotationsV2UpdateItemSchema) [ 1 .. 1000 ] |
{- "items": [
- {
- "id": 4096,
- "update": {
- "data": {
- "set": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.4,
- "xMax": 0.5,
- "yMin": 0.4,
- "yMax": 0.5
}, - "pageNumber": 43
}
}
}
}
]
}
{- "items": [
- {
- "id": 4096,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "annotatedResourceType": "file",
- "annotatedResourceId": 1337,
- "annotationType": "images.ObjectDetection",
- "creatingApp": "cognite-vision",
- "creatingAppVersion": "1.2.1",
- "creatingUser": "john.doe@cognite.com",
- "data": {
- "assetRef": {
- "externalId": "abc"
}, - "symbolRegion": {
- "xMin": 0.1,
- "xMax": 0.2,
- "yMin": 0.1,
- "yMax": 0.2
}, - "textRegion": {
- "xMin": 0.2,
- "xMax": 0.3,
- "yMin": 0.2,
- "yMax": 0.3
}, - "pageNumber": 43
}, - "status": "approved"
}
]
}
Transformations enable users to use Spark SQL queries to transform data from the CDF staging area, RAW, into the CDF data model.
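To make the idea concrete, the snippet below sketches one item for a transformation-create request that reads from a hypothetical RAW staging table and upserts into assets. The external ID, RAW database/table names, and the SQL itself are illustrative only.

```python
# One item for the transformation-create endpoint (illustrative values).
transformation = {
    "externalId": "raw-to-assets",
    "name": "RAW to assets",
    # Spark SQL: map rows from a RAW staging table onto the assets resource type
    "query": """
        SELECT
            key         AS externalId,
            name        AS name,
            'my-source' AS source
        FROM my_raw_db.my_raw_table
    """,
    "destination": {"type": "assets"},
    "conflictMode": "upsert",   # one of: abort, upsert, update, delete
    "ignoreNullFields": True,
    "isPublic": True,
}

body = {"items": [transformation]}
```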
project required | string The project name. |
externalId required | string The external ID provided by the client. Must be unique for the resource type. |
{- "externalId": "string"
}
{ }
Create a maximum of 1000 transformations per request.
project required | string The project name. |
Array of objects (TransformationCreate) <= 1000 items |
{- "items": [
- {
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "sourceOidcCredentials": {
- "clientId": "string",
- "clientSecret": "string",
- "scopes": "string",
- "tokenUri": "string",
- "cdfProjectName": "string",
- "audience": "string"
}, - "destinationOidcCredentials": {
- "clientId": "string",
- "clientSecret": "string",
- "scopes": "string",
- "tokenUri": "string",
- "cdfProjectName": "string",
- "audience": "string"
}, - "sourceNonce": {
- "sessionId": 0,
- "nonce": "string",
- "cdfProjectName": "string",
- "clientId": "string"
}, - "destinationNonce": {
- "sessionId": 0,
- "nonce": "string",
- "cdfProjectName": "string",
- "clientId": "string"
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
]
}
{- "items": [
- {
- "id": 0,
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "blocked": {
- "reason": "string",
- "createdTime": 0
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "owner": "string",
- "ownerIsCurrentUser": true,
- "hasSourceOidcCredentials": true,
- "hasDestinationOidcCredentials": true,
- "sourceSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "destinationSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "lastFinishedJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "runningJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "schedule": {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
]
}
Delete a maximum of 1000 transformations by IDs and external IDs per request.
project required | string The project name. |
Array of TransformationCogniteExternalId (object) or TransformationCogniteInternalId (object) <= 1000 items | |
ignoreUnknownIds | boolean Ignore IDs and external IDs that are not found. Defaults to false. |
{- "items": [
- {
- "externalId": "string"
}
], - "ignoreUnknownIds": true
}
{ }
Filter transformations. Use nextCursor to paginate through the results.
project required | string The project name. |
required | object (TransformationFilter) |
limit | integer <int32> |
cursor | string |
{- "filter": {
- "isPublic": true,
- "nameRegex": "string",
- "queryRegex": "string",
- "destinationType": "string",
- "conflictMode": "abort",
- "hasBlockedError": true,
- "cdfProjectName": "string",
- "createdTime": {
- "min": 0,
- "max": 0
}, - "lastUpdatedTime": {
- "min": 0,
- "max": 0
}, - "dataSetIds": [
- {
- "externalId": "string"
}
], - "tags": {
- "containsAny": [
- "string"
]
}
}, - "limit": 0,
- "cursor": "string"
}
{- "items": [
- {
- "id": 0,
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "blocked": {
- "reason": "string",
- "createdTime": 0
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "owner": "string",
- "ownerIsCurrentUser": true,
- "hasSourceOidcCredentials": true,
- "hasDestinationOidcCredentials": true,
- "sourceSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "destinationSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "lastFinishedJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "runningJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "schedule": {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
], - "nextCursor": "string"
}
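Cursor-based pagination over these list/filter endpoints can be sketched as follows. All parameters except the cursor must stay the same between requests; fetch_page here is a stand-in for the actual HTTP call, and the two-page backend below is a fake used only to illustrate the loop.

```python
def paginate(fetch_page, limit=100):
    """Yield every item across pages: keep requesting with the returned
    nextCursor until the response no longer carries one."""
    cursor = None
    while True:
        page = fetch_page(cursor=cursor, limit=limit)
        yield from page["items"]
        cursor = page.get("nextCursor")
        if not cursor:
            break

# Fake two-page backend standing in for the API:
pages = {
    None: {"items": [1, 2], "nextCursor": "abc"},
    "abc": {"items": [3], "nextCursor": None},
}
items = list(paginate(lambda cursor, limit: pages[cursor]))
# items == [1, 2, 3]
```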
List transformations. Use nextCursor to paginate through the results.
project required | string The project name. |
limit | integer <int32> [ 1 .. 1000 ] Limits the number of results to be returned. The maximum is 1000, default limit is 100. |
cursor | string Cursor for paging through results. |
includePublic | boolean Whether public transformations should be included in the results. The default is true. |
withJobDetails | boolean Whether transformations should contain information about jobs. The default is true. |
transformations_list = client.transformations.list()
{- "items": [
- {
- "id": 0,
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "blocked": {
- "reason": "string",
- "createdTime": 0
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "owner": "string",
- "ownerIsCurrentUser": true,
- "hasSourceOidcCredentials": true,
- "hasDestinationOidcCredentials": true,
- "sourceSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "destinationSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "lastFinishedJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "runningJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "schedule": {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
], - "nextCursor": "string"
}
Retrieve a maximum of 1000 transformations by IDs and external IDs per request.
project required | string The project name. |
Array of TransformationCogniteExternalId (object) or TransformationCogniteInternalId (object) <= 1000 items | |
ignoreUnknownIds | boolean Ignore IDs and external IDs that are not found. Defaults to false. |
withJobDetails required | boolean Whether the transformations will be returned with running job and last created job details. |
{- "items": [
- {
- "externalId": "string"
}
], - "ignoreUnknownIds": true,
- "withJobDetails": true
}
{- "items": [
- {
- "id": 0,
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "blocked": {
- "reason": "string",
- "createdTime": 0
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "owner": "string",
- "ownerIsCurrentUser": true,
- "hasSourceOidcCredentials": true,
- "hasDestinationOidcCredentials": true,
- "sourceSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "destinationSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "lastFinishedJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "runningJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "schedule": {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
]
}
project required | string The project name. |
externalId required | string |
object (NonceCredentials) |
{- "externalId": "string",
- "nonce": {
- "sessionId": 0,
- "nonce": "string",
- "cdfProjectName": "string",
- "clientId": "string"
}
}
{- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}
Update the attributes of transformations, maximum 1000 per request.
project required | string The project name. |
Array of UpdateItemWithExternalId_TransformationUpdate (object) or UpdateItemWithId_TransformationUpdate (object) <= 1000 items |
{- "items": [
- {
- "update": {
- "name": {
- "set": "string"
}, - "externalId": {
- "set": "string"
}, - "destination": {
- "set": {
- "type": "assets"
}
}, - "conflictMode": {
- "set": "abort"
}, - "query": {
- "set": "string"
}, - "sourceOidcCredentials": {
- "setNull": true
}, - "destinationOidcCredentials": {
- "setNull": true
}, - "sourceNonce": {
- "setNull": true
}, - "destinationNonce": {
- "setNull": true
}, - "isPublic": {
- "set": true
}, - "ignoreNullFields": {
- "set": true
}, - "dataSetId": {
- "setNull": true
}, - "tags": {
- "set": [
- "string"
]
}
}, - "externalId": "string"
}
]
}
{- "items": [
- {
- "id": 0,
- "name": "string",
- "query": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "isPublic": true,
- "blocked": {
- "reason": "string",
- "createdTime": 0
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "owner": "string",
- "ownerIsCurrentUser": true,
- "hasSourceOidcCredentials": true,
- "hasDestinationOidcCredentials": true,
- "sourceSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "destinationSession": {
- "clientId": "string",
- "sessionId": 0,
- "projectName": "string"
}, - "lastFinishedJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "runningJob": {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}, - "schedule": {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}, - "externalId": "string",
- "ignoreNullFields": true,
- "dataSetId": 0,
- "tags": [
- "string"
]
}
]
}
project required | string The project name. |
id required | integer <int32> The job id. |
res = client.transformations.jobs.list_metrics(id=1)
{- "items": [
- {
- "timestamp": 0,
- "name": "assets.read",
- "count": 0
}
]
}
project required | string The project name. |
transformationId | integer <int32> List only jobs for the specified transformation. The transformation is identified by ID. |
transformationExternalId | string List only jobs for the specified transformation. The transformation is identified by external ID. |
limit | integer <int32> [ 1 .. 1000 ] Limits the number of results to be returned. The maximum is 1000, default limit is 100. |
cursor | string Cursor for paging through results. |
transformation_jobs_list = client.transformations.jobs.list()
transformation_jobs_list = client.transformations.jobs.list(transformation_id=1)
{- "items": [
- {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}
], - "nextCursor": "string"
}
Retrieve a maximum of 1000 jobs by IDs per request.
project required | string The project name. |
Array of objects (TransformationCogniteInternalId) | |
ignoreUnknownIds | boolean Ignore IDs and external IDs that are not found. Defaults to false. |
{- "items": [
- {
- "id": 0
}
], - "ignoreUnknownIds": true
}
{- "items": [
- {
- "id": 0,
- "uuid": "string",
- "transformationId": 0,
- "transformationExternalId": "string",
- "sourceProject": "string",
- "destinationProject": "string",
- "destination": {
- "type": "assets"
}, - "conflictMode": "abort",
- "query": "string",
- "createdTime": 0,
- "startedTime": 0,
- "finishedTime": 0,
- "lastSeenTime": 0,
- "error": "string",
- "ignoreNullFields": true,
- "status": "Completed"
}
]
}
Transformation schedules allow you to run transformations with a specific input at intervals defined by a cron expression. These transformation jobs will be asynchronous and show up in the transformation job list. Visit http://www.cronmaker.com to generate a cron expression with a UI.
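A schedule-create request body can be assembled as sketched below. The helper name, external ID, and cron expression are illustrative; a cron expression has five fields (minute, hour, day of month, month, day of week).

```python
def make_schedule(external_id, cron, paused=False):
    """One item for the schedule-create endpoint (illustrative helper)."""
    return {
        "externalId": external_id,   # the transformation to schedule
        "interval": cron,            # cron expression defining the run interval
        "isPaused": paused,
    }

# Run "my-transformation" at the top of every hour:
body = {"items": [make_schedule("my-transformation", "0 * * * *")]}
```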
List all transformation schedules. Use nextCursor to paginate through the results.
project required | string The project name. |
limit | integer <int32> [ 1 .. 1000 ] Limits the number of results to be returned. The maximum is 1000, default limit is 100. |
cursor | string Cursor for paging through results. |
includePublic | boolean Whether public transformations should be included in the results. The default is true. |
schedules_list = client.transformations.schedules.list()
{- "items": [
- {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}
], - "nextCursor": "string"
}
Retrieve transformation schedules by transformation IDs or external IDs.
project required | string The project name. |
List of transformation IDs of schedules to retrieve; at most 1000 items, all of which must be unique.
Array of TransformationCogniteExternalId (object) or TransformationCogniteInternalId (object) <= 1000 items | |
ignoreUnknownIds | boolean Ignore IDs and external IDs that are not found. Defaults to false. |
{- "items": [
- {
- "externalId": "string"
}
], - "ignoreUnknownIds": true
}
{- "items": [
- {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}
]
}
Schedule transformations with the specified configurations.
project required | string The project name. |
List of the schedules to create; at most 1000 items.
Array of ScheduleParametersWithExternalId (object) or ScheduleParametersWithId (object) |
{- "items": [
- {
- "interval": "string",
- "isPaused": true,
- "externalId": "string"
}
]
}
{- "items": [
- {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}
]
}
Unschedule transformations by IDs or external IDs.
project required | string The project name. |
List of transformation IDs of schedules to delete; at most 1000 items, all of which must be unique.
Array of TransformationCogniteExternalId (object) or TransformationCogniteInternalId (object) <= 1000 items | |
ignoreUnknownIds | boolean Ignore IDs and external IDs that are not found. Defaults to false. |
{- "items": [
- {
- "externalId": "string"
}
], - "ignoreUnknownIds": true
}
{ }
project required | string The project name. |
List of schedule updates; at most 1000 items.
Array of UpdateItemWithExternalId_ScheduleUpdate (object) or UpdateItemWithId_ScheduleUpdate (object) |
{- "items": [
- {
- "update": {
- "interval": {
- "set": "string"
}, - "isPaused": {
- "set": true
}
}, - "externalId": "string"
}
]
}
{- "items": [
- {
- "id": 0,
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "interval": "string",
- "isPaused": true
}
]
}
Transformation notifications let subscribed users know when a job fails.
Deletes the specified notification subscriptions on the transformation. Requests to delete non-existing subscriptions do nothing and do not throw an error.
project required | string The project name. |
List of IDs to be deleted.
Array of objects (TransformationCogniteInternalId) |
{- "items": [
- {
- "id": 0
}
]
}
{ }
project required | string The project name. |
transformationId | integer <int32> List only notifications for the specified transformation. The transformation is identified by ID. |
transformationExternalId | string List only notifications for the specified transformation. The transformation is identified by external ID. |
destination | string Filter by notification destination. |
limit | integer <int32> [ 1 .. 1000 ] Limits the number of results to be returned. The maximum is 1000, default limit is 100. |
cursor | string Cursor for paging through results. |
notifications_list = client.transformations.notifications.list()
notifications_list = client.transformations.notifications.list(transformation_id=1)
{- "items": [
- {
- "id": 0,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "transformationId": 0,
- "destination": "string"
}
], - "nextCursor": "string"
}
Subscribe to notifications for transformation errors.
project required | string The project name. |
List of notifications for new subscriptions; at most 1000 items.
Array of NotificationCreateWithExternalId (object) or NotificationCreateWithId (object) <= 1000 items |
{- "items": [
- {
- "destination": "string",
- "transformationExternalId": "string"
}
]
}
{- "items": [
- {
- "id": 0,
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "transformationId": 0,
- "destination": "string"
}
]
}
Preview a SQL query.
project required | string The project name. |
query required | string SQL query to run for preview. |
convertToString required | boolean Stringify values in the query results. |
limit | integer <int32> End-result limit of the query. |
sourceLimit | integer <int32> Limit for how many rows to download from the data sources. |
inferSchemaLimit | integer <int32> Limit for how many rows are used for inferring the schema. Default is 10,000. |
timeout | integer <int32> Number of seconds to wait before cancelling a query. The default, and maximum, is 240. |
{- "query": "string",
- "convertToString": true,
- "limit": 0,
- "sourceLimit": 0,
- "inferSchemaLimit": 0,
- "timeout": 0
}
{- "schema": {
- "items": [
- {
- "name": "string",
- "sqlType": "string",
- "type": {
- "type": "string"
}, - "nullable": true
}
]
}, - "results": {
- "items": [
- {
- "property1": "string",
- "property2": "string"
}
]
}
}
For a View-centric schema, viewSpace, viewExternalId, and viewVersion must be specified, while withInstanceSpace, isConnectionDefinition, and instanceType are optional. For a Data Model-centric schema, dataModelSpace, dataModelExternalId, dataModelVersion, and type must be specified, and relationshipFromType is optional. In both scenarios, conflictMode is required.
project required | string The project name. |
conflictMode | string Enum: "abort" "upsert" "update" "delete" conflict mode of the transformation.
One of the following conflictMode types can be provided:
|
viewSpace | string non-empty Space of the View. Not required if |
viewExternalId | string non-empty External id of the View. Not required if |
viewVersion | string non-empty Version of the View. Not required if |
instanceType | string Enum: "nodes" "edges" Instance type to deal with |
withInstanceSpace | boolean Is instance space set at the transformation config or not |
isConnectionDefinition | boolean If the edge is a connection definition or not |
dataModelSpace | string non-empty Space of the Data Model. Relevant for Data Model centric schema. |
dataModelExternalId | string non-empty External id of the Data Model. Relevant for Data Model centric schema. |
dataModelVersion | string non-empty Version of the Data Model. Relevant for Data Model centric schema. |
type | string non-empty External id of the View in the Data model. Relevant for Data Model centric schema. |
relationshipFromType | string non-empty Property Identifier of Connection Definition in |
{- "items": [
- {
- "name": "string",
- "sqlType": "string",
- "type": {
- "type": "string"
}, - "nullable": true
}
]
}
project required | string The project name. |
externalId required | string non-empty External ID of the Sequence |
{- "items": [
- {
- "name": "string",
- "sqlType": "string",
- "type": {
- "type": "string"
}, - "nullable": true
}
]
}
project required | string The project name. |
schemaType required | string Name of the target schema type, please provide one of the following:
|
conflictMode | string One of the following conflictMode types can be provided:
|
from cognite.client.data_classes import TransformationDestination
columns = client.transformations.schema.retrieve(destination=TransformationDestination.assets())
{- "items": [
- {
- "name": "string",
- "sqlType": "string",
- "type": {
- "type": "string"
}, - "nullable": true
}
]
}
Functions enable Python code to be hosted and executed in the cloud, on demand or on a schedule. Execution, status, and logs are available through the API. A function is uploaded to the Files API as a zip file containing at least a Python file called handler.py (unless specified otherwise through the functionPath argument). This file must contain a function named handle with any of the following arguments: data, client, secrets, or function_call_info, which are passed into the function.
The latest versions of the Cognite SDKs are available, and additional Python packages and version specifications can be defined in a requirements.txt in the root of the zip.
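A minimal handler.py following this contract might look like the sketch below. The function body is illustrative; only the module-level handle function and its argument names come from the contract above.

```python
def handle(data, client=None, secrets=None, function_call_info=None):
    """Entry point invoked by Cognite Functions.

    data               -- the JSON payload passed to the function call
    client             -- an authenticated CogniteClient (injected at runtime)
    secrets            -- dict of secrets configured on the function
    function_call_info -- metadata about this invocation
    """
    # Illustrative logic: double a value from the input payload.
    value = data.get("value", 0)
    return {"doubled": value * 2}
```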
Activate Cognite Functions. This will create the necessary backend infrastructure for Cognite Functions.
status = client.functions.activate()
{
  "status": "inactive"
}
You can only create one function per request.
required | Array of objects (Function) = 1 items Array of functions to create. |
{
  "items": [
    {
      "name": "My awesome function",
      "fileId": 5467347282343
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "createdTime": 123455234,
      "status": "Queued",
      "name": "myfunction",
      "externalId": "my.known.id",
      "fileId": 1,
      "owner": "user@cognite.com",
      "description": "My fantastic function with advanced ML",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "secrets": {
        "MySecret": "***"
      },
      "functionPath": "myfunction/handler.py",
      "envVars": {
        "MyKey": "MyValue"
      },
      "cpu": 0.25,
      "memory": 1,
      "runtime": "py38",
      "runtimeVersion": "Python 3.8.13",
      "error": {
        "code": 400,
        "message": "Could not build function."
      }
    }
  ]
}
Delete functions. You can delete a maximum of 10 functions per request. Function source files stored in the Files API must be deleted separately.
required | Array of FunctionId (object) or FunctionExternalId (object) (FunctionIdEither) [ 1 .. 10 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {
      "id": 1
    }
  ],
  "ignoreUnknownIds": false
}
{ }
Use advanced filtering options to find functions.
object (FunctionFilter) | |
limit | integer <int32> >= 1 Default: 100 Limits the number of results to be returned. |
{
  "filter": {
    "status": "Queued",
    "createdTime": {
      "min": 10,
      "max": 199
    }
  }
}
{
  "items": [
    {
      "id": 1,
      "createdTime": 123455234,
      "status": "Queued",
      "name": "myfunction",
      "externalId": "my.known.id",
      "fileId": 1,
      "owner": "user@cognite.com",
      "description": "My fantastic function with advanced ML",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "secrets": {
        "MySecret": "***"
      },
      "functionPath": "myfunction/handler.py",
      "envVars": {
        "MyKey": "MyValue"
      },
      "cpu": 0.25,
      "memory": 1,
      "runtime": "py38",
      "runtimeVersion": "Python 3.8.13",
      "error": {
        "code": 400,
        "message": "Could not build function."
      }
    }
  ]
}
Service limits for the associated project.
limits = client.functions.limits()
{
  "timeoutMinutes": 15,
  "cpuCores": {
    "min": 0.1,
    "max": 0.6,
    "default": 0.25
  },
  "memoryGb": {
    "min": 0.1,
    "max": 2.5,
    "default": 1
  },
  "runtimes": [
    "py37",
    "py38",
    "py39",
    "py310"
  ],
  "responseSizeMb": 1
}
List functions.
limit | integer >= 1 Default: 100 Limits the number of results to be returned. |
{
  "items": [
    {
      "id": 1,
      "createdTime": 123455234,
      "status": "Queued",
      "name": "myfunction",
      "externalId": "my.known.id",
      "fileId": 1,
      "owner": "user@cognite.com",
      "description": "My fantastic function with advanced ML",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "secrets": {
        "MySecret": "***"
      },
      "functionPath": "myfunction/handler.py",
      "envVars": {
        "MyKey": "MyValue"
      },
      "cpu": 0.25,
      "memory": 1,
      "runtime": "py38",
      "runtimeVersion": "Python 3.8.13",
      "error": {
        "code": 400,
        "message": "Could not build function."
      }
    }
  ]
}
Retrieve a function by its id. If you want to retrieve functions by external ids, use Retrieve functions instead.
functionId required | integer The function id. |
{
  "id": 1,
  "createdTime": 123455234,
  "status": "Queued",
  "name": "myfunction",
  "externalId": "my.known.id",
  "fileId": 1,
  "owner": "user@cognite.com",
  "description": "My fantastic function with advanced ML",
  "metadata": {
    "property1": "string",
    "property2": "string"
  },
  "secrets": {
    "MySecret": "***"
  },
  "functionPath": "myfunction/handler.py",
  "envVars": {
    "MyKey": "MyValue"
  },
  "cpu": 0.25,
  "memory": 1,
  "runtime": "py38",
  "runtimeVersion": "Python 3.8.13",
  "error": {
    "code": 400,
    "message": "Could not build function."
  }
}
Retrieve functions by ids.
required | Array of FunctionId (object) or FunctionExternalId (object) (FunctionIdEither) [ 1 .. 10 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {
      "id": 1
    }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "createdTime": 123455234,
      "status": "Queued",
      "name": "myfunction",
      "externalId": "my.known.id",
      "fileId": 1,
      "owner": "user@cognite.com",
      "description": "My fantastic function with advanced ML",
      "metadata": {
        "property1": "string",
        "property2": "string"
      },
      "secrets": {
        "MySecret": "***"
      },
      "functionPath": "myfunction/handler.py",
      "envVars": {
        "MyKey": "MyValue"
      },
      "cpu": 0.25,
      "memory": 1,
      "runtime": "py38",
      "runtimeVersion": "Python 3.8.13",
      "error": {
        "code": 400,
        "message": "Could not build function."
      }
    }
  ]
}
Perform a function call. To provide input data to the function, add it in an object called data in the request body. It will be available as the data argument to the function. Information about the function call at runtime can be obtained through the function_call_info argument, if it is added to the function handle. WARNING: Do not pass secrets or other confidential information via the data object. Use the dedicated secrets object in the request body of "Create functions" for this purpose.
functionId required | integer The function id. |
data | object (data) Input data to the function (only present if provided on the schedule). This data is passed deserialized into the function through one of the arguments called |
nonce | string (nonce) Nonce retrieved from sessions API when creating a session. This will be used to bind the session before executing the function. The corresponding access token will be passed to the function and used to instantiate the client of the handle() function. You can create a session via the Sessions API. When using the Python SDK, the session will be created behind the scenes when creating the schedule. |
{
  "data": {
    "timeSeriesId1": 13435351,
    "maxValue": 4
  },
  "nonce": "string"
}
{
  "id": 1,
  "status": "Running",
  "startTime": 0,
  "endTime": 0,
  "error": {
    "trace": "Cannot assign foo to bar.",
    "message": "Could not authenticate."
  },
  "scheduleId": 1,
  "functionId": 1,
  "scheduledTime": 0
}
Use advanced filtering options to find function calls.
functionId required | integer The function id. |
object (FunctionCallFilter) | |
limit | integer <int32> >= 1 Default: 100 Limits the number of results to be returned. |
cursor | string |
{
  "filter": {
    "status": "Running",
    "scheduleId": 123,
    "startTime": {
      "min": 1234,
      "max": 5678
    }
  },
  "limit": 10
}
{
  "items": [
    {
      "id": 1,
      "status": "Running",
      "startTime": 0,
      "endTime": 0,
      "error": {
        "trace": "Cannot assign foo to bar.",
        "message": "Could not authenticate."
      },
      "scheduleId": 1,
      "functionId": 1,
      "scheduledTime": 0
    }
  ],
  "nextCursor": "string"
}
List function calls.
functionId required | integer The function id. |
limit | integer >= 1 Default: 100 Limits the number of results to be returned. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
logs = client.functions.calls.get_logs(call_id=2, function_id=1)
call = client.functions.calls.retrieve(call_id=2, function_id=1)
logs = call.get_logs()
{
  "items": [
    {
      "id": 1,
      "status": "Running",
      "startTime": 0,
      "endTime": 0,
      "error": {
        "trace": "Cannot assign foo to bar.",
        "message": "Could not authenticate."
      },
      "scheduleId": 1,
      "functionId": 1,
      "scheduledTime": 0
    }
  ],
  "nextCursor": "string"
}
Retrieve a function call by its id.
callId required | integer The function call id. |
functionId required | integer The function id. |
{
  "id": 1,
  "status": "Running",
  "startTime": 0,
  "endTime": 0,
  "error": {
    "trace": "Cannot assign foo to bar.",
    "message": "Could not authenticate."
  },
  "scheduleId": 1,
  "functionId": 1,
  "scheduledTime": 0
}
Retrieve function calls by call ids.
functionId required | integer The function id. |
List of IDs of the calls to retrieve. Maximum of 10000 items; all IDs must be unique.
required | Array of objects (FunctionCallId) [ 1 .. 10000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {
      "id": 1
    }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "status": "Running",
      "startTime": 0,
      "endTime": 0,
      "error": {
        "trace": "Cannot assign foo to bar.",
        "message": "Could not authenticate."
      },
      "scheduleId": 1,
      "functionId": 1,
      "scheduledTime": 0
    }
  ]
}
Get logs from a function call.
callId required | integer The function call id. |
functionId required | integer The function id. |
{
  "items": [
    {
      "timestamp": 1585350274000,
      "message": "Did do fancy thing"
    },
    {
      "timestamp": 1585350276000,
      "message": "Did do another fancy thing"
    }
  ]
}
Retrieve response from a function call.
callId required | integer The function call id. |
functionId required | integer The function id. |
response = client.functions.calls.get_response(call_id=2, function_id=1)
call = client.functions.calls.retrieve(call_id=2, function_id=1)
response = call.get_response()
{
  "response": {
    "numAssets": 1234,
    "someCalculation": 3.14
  }
}
Function schedules allow you to run functions with a specific input at intervals defined by a cron expression. These function calls will be asynchronous and show up in the function call list. Visit http://www.cronmaker.com to generate a cron expression with a UI.
Create function schedules. Function schedules trigger asynchronous calls with specific input data, based on a cron expression that determines when the triggers should fire. A tool such as http://www.cronmaker.com can guide you in generating a cron expression. One of FunctionId or FunctionExternalId (deprecated) must be set, but not both. When creating a schedule with a session, i.e. with a nonce, FunctionId must be used. The nonce will be used to bind the session before function execution, and the session will be kept alive for the lifetime of the schedule. WARNING: Do not pass secrets or other confidential information via the data object. Use the dedicated secrets object in the request body of "Create functions" for this purpose.
required | Array of objects (FunctionSchedule) = 1 items |
{
  "items": [
    {
      "name": "My schedule",
      "description": "This is a nice schedule",
      "cronExpression": "* * * * *",
      "functionId": 1,
      "functionExternalId": "my.known.id",
      "data": {
        "timeSeriesId1": 13435351,
        "maxValue": 4
      },
      "nonce": "string"
    }
  ]
}
{
  "items": [
    {
      "id": 1,
      "name": "My schedule",
      "createdTime": 0,
      "description": "This is a nice schedule",
      "cronExpression": "* * * * *",
      "when": "Every hour",
      "functionId": 1,
      "functionExternalId": "my.known.id",
      "sessionId": "string"
    }
  ]
}
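The cronExpression field uses the standard five-field cron format (minute, hour, day of month, month, day of week). A rough client-side shape check, assuming that format (this is an approximation, not a complete cron grammar, and not part of the API):

```python
import re

# One cron field: "*", a number, a range, or a comma-separated list,
# optionally with a "/step" suffix -- a rough approximation only.
_FIELD = r"(\*|\d+)(-\d+)?(,(\*|\d+)(-\d+)?)*(/\d+)?"
_CRON_RE = re.compile(r"^\s*" + r"\s+".join([_FIELD] * 5) + r"\s*$")

def looks_like_cron(expression: str) -> bool:
    """Return True if `expression` has the five-field cron shape."""
    return _CRON_RE.match(expression) is not None
```

A full validator (or a UI such as cronmaker) is still the safer way to author expressions; this check only catches gross shape errors before the request is sent.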
Delete function schedules.
required | Array of objects (FunctionScheduleId) [ 1 .. 10000 ] items |
{
  "items": [
    {
      "id": 1
    }
  ]
}
{ }
Use advanced filtering options to find function schedules. At most one of FunctionId or FunctionExternalId can be specified.
object (FunctionScheduleFilter) | |
limit | integer <int32> >= 1 Default: 100 Limits the number of results to be returned. |
{
  "filter": {
    "name": "MySchedule",
    "cronExpression": "5 4 * * *"
  }
}
{
  "items": [
    {
      "id": 1,
      "name": "My schedule",
      "createdTime": 0,
      "description": "This is a nice schedule",
      "cronExpression": "* * * * *",
      "when": "Every hour",
      "functionId": 1,
      "functionExternalId": "my.known.id",
      "sessionId": "string"
    }
  ]
}
List function schedules in the project.
limit | integer >= 1 Default: 100 Limits the number of results to be returned. |
{
  "items": [
    {
      "id": 1,
      "name": "My schedule",
      "createdTime": 0,
      "description": "This is a nice schedule",
      "cronExpression": "* * * * *",
      "when": "Every hour",
      "functionId": 1,
      "functionExternalId": "my.known.id",
      "sessionId": "string"
    }
  ]
}
Retrieve a function schedule by its id.
scheduleId required | integer The function schedule id. |
{
  "id": 1,
  "name": "My schedule",
  "createdTime": 0,
  "description": "This is a nice schedule",
  "cronExpression": "* * * * *",
  "when": "Every hour",
  "functionId": 1,
  "functionExternalId": "my.known.id",
  "sessionId": "string"
}
Retrieve the input data to the associated function.
scheduleId required | integer The function schedule id. |
client.functions.schedules.get_input_data(id=123)
{
  "id": 1,
  "data": {
    "timeSeriesId1": 13435351,
    "maxValue": 4
  }
}
Retrieve function schedules by schedule ids.
List of IDs of the schedules to retrieve. Maximum of 10000 items; all IDs must be unique.
required | Array of objects [ 1 .. 10000 ] items |
ignoreUnknownIds | boolean Default: false Ignore IDs and external IDs that are not found |
{
  "items": [
    {
      "id": 1
    }
  ],
  "ignoreUnknownIds": false
}
{
  "items": [
    {
      "id": 1,
      "name": "My schedule",
      "createdTime": 0,
      "description": "This is a nice schedule",
      "cronExpression": "* * * * *",
      "when": "Every hour",
      "functionId": 1,
      "functionExternalId": "my.known.id",
      "sessionId": "string"
    }
  ]
}
Use the Data Model Storage API to manage your data in CDF.
You define a data model with views and containers. Containers specify how the service stores data, while views define the logical view of the stored data. Nodes and edges hold the stored data; we call these node and edge representations "instances". Each node or edge can hold data for one or more containers. You can query nodes and edges using views, which combine data from one or more containers.
A container represents a bag of typed properties. Properties may have optional constraints and default values, and you can choose to index them.
Each container has a default view available so you can query all its properties.
A data model is a collection (set) of views. Use the data model to group and structure views into a recognizable, well-understood model. The model represents a reusable collection of data.
Each data model has a version. A data model can also import other data models, so you can reuse views.
All the resources described here have to belong to a space. You use a space to create a logical grouping of your resources and to act as a governance boundary. A space defines a namespace.
You identify a resource by its space (name) and its external-id - externalId. For view or container properties, the external-id of the view or container is part of the identifier. This means you can have resources with the same external-id in two different spaces.
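To illustrate the identifier rule above: a resource identity is the pair (space, externalId), plus a version for views and data models, so the same externalId in two different spaces names two distinct resources. A sketch (the space and externalId values are made up):

```python
# Identity of a versioned data-modeling resource, modeled as a tuple of
# (space, externalId, version). The same externalId in two different
# spaces refers to two distinct resources.
pump_in_prod = ("prod-space", "Pump", "v1")
pump_in_test = ("test-space", "Pump", "v1")

same_external_id = pump_in_prod[1] == pump_in_test[1]   # True
same_resource = pump_in_prod == pump_in_test            # False: spaces differ
```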
Add or update (upsert) data models. For unchanged data model specifications, the operation completes without making any changes. We will not update the lastUpdatedTime value for models that remain unchanged.
List of data models to add
required | Array of objects (DataModelCreate) [ 1 .. 100 ] items List of data models to create/update |
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "version": "string",
      "views": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ]
    }
  ]
}
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "version": "string",
      "views": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ]
}
Delete one or more data models. Currently limited to 100 models at a time. This does not delete the views, nor the containers they reference.
List of references to data models you wish to delete
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
List data models in a project that match a given filter. You can filter based on name, externalIds, versions, and space. The endpoint also supports sorting and pagination.
cursor | string |
limit | integer [ 1 .. 1000 ] Default: 10 |
spaces | Array of strings (SpaceSpecification) [ items [ 1 .. 43 ] characters ^[a-zA-Z][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$ ] List of spaces you want to use to limit the returned matches by |
allVersions | boolean (AllVersionsFlag) Default: false If all versions of the entity should be returned. Defaults to false which returns the latest version, attributed to the newest 'createdTime' field |
includeGlobal | boolean (IncludeGlobalFlag) Default: false If the global items of the entity should be returned. Defaults to false which excludes global items. |
{
  "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
  "limit": 10,
  "spaces": [
    "string"
  ],
  "allVersions": false,
  "includeGlobal": false
}
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "version": "string",
      "views": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ],
  "nextCursor": "string"
}
List data models defined in the project. You can filter the returned models by the specified space.
limit | integer [ 1 .. 1000 ] Default: 10 Limit the number of results returned. The largest result-set returned by the server will be 1000 items, even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
inlineViews | boolean Default: false Whether to expand the referenced views inline in the returned result. |
space | string [ 1 .. 43 ] characters ^[a-zA-Z0-9][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$ Example: space=timeseries The space to query. |
allVersions | boolean Default: false If all versions of the entity should be returned. Defaults to false which returns the latest version, attributed to the newest 'createdTime' field |
includeGlobal | boolean Default: false If the global items of the entity should be returned. Defaults to false which excludes global items. |
data_model_list = client.data_modeling.data_models.list(limit=5)
for data_model in client.data_modeling.data_models:
    data_model  # do something with the data_model
for data_model_list in client.data_modeling.data_models(chunk_size=10):
    data_model_list  # do something with the data model list
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "version": "string",
      "views": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ],
  "nextCursor": "string"
}
Retrieve up to 100 data models by their external ids. Views can be auto-expanded when the inlineViews query parameter is set.
inlineViews | boolean Default: false Whether to expand the referenced views inline in the returned result. |
List of external-ids of data models to retrieve.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "version": "string",
      "views": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ]
}
Add or update (upsert) spaces. For unchanged space specifications, the operation completes without making any changes. We will not update the lastUpdatedTime value for spaces that remain unchanged.
Spaces to add or update.
required | Array of objects (SpaceCreateDefinition) [ 1 .. 100 ] items List of spaces to create/update |
{
  "items": [
    {
      "space": "string",
      "description": "string",
      "name": "string"
    }
  ]
}
{
  "items": [
    {
      "space": "string",
      "description": "string",
      "name": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ]
}
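Space identifiers must match the character pattern quoted for the space parameter elsewhere in this reference (1 to 43 characters, starting with a letter or digit, with letters, digits, `_`, and `-` allowed afterwards). A client-side pre-check of that documented pattern, before sending the upsert request, could look like this:

```python
import re

# Pattern quoted for the `space` parameter in this reference:
# ^[a-zA-Z0-9][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$  (1-43 characters)
_SPACE_RE = re.compile(r"^[a-zA-Z0-9][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$")

def is_valid_space_id(space: str) -> bool:
    """Return True if `space` is a syntactically valid space identifier."""
    return _SPACE_RE.match(space) is not None
```

Note that the server may apply further rules (for example, reserved names); this check only covers the published character pattern.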
Delete one or more spaces. Currently limited to 100 spaces at a time.
If an existing data model references a space, you cannot delete that space. Nodes, edges, and other data types that were part of a deleted space will no longer be available.
List of space-ids for spaces to delete.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "space": "string"
    }
  ]
}
{
  "items": [
    {
      "space": "string"
    }
  ]
}
List spaces defined in the current project.
limit | integer [ 1 .. 1000 ] Default: 10 Limit the number of results returned. The largest result-set returned by the server will be 1000 items, even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
includeGlobal | boolean Default: false If the global items of the entity should be returned. Defaults to false which excludes global items. |
space_list = client.data_modeling.spaces.list(limit=5)
for space in client.data_modeling.spaces:
    space  # do something with the space
for space_list in client.data_modeling.spaces(chunk_size=2500):
    space_list  # do something with the spaces
{
  "items": [
    {
      "space": "string",
      "description": "string",
      "name": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ],
  "nextCursor": "string"
}
Retrieve up to 100 spaces by specifying their space-ids.
List of space-ids for the spaces to return.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "space": "string"
    }
  ]
}
{
  "items": [
    {
      "space": "string",
      "description": "string",
      "name": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ]
}
Add or update (upsert) views. For unchanged view specifications, the operation completes without making any changes. We will not update the lastUpdatedTime value for views that remain unchanged.
Views to add or update.
required | Array of objects (ViewCreateDefinition) [ 1 .. 100 ] items List of views to create/update |
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "name": "string",
      "description": "string",
      "filter": {
        "and": [
          {
            "in": {
              "property": [
                "tag"
              ],
              "values": [
                10011,
                10011
              ]
            }
          },
          {
            "range": {
              "property": [
                "weight"
              ],
              "gte": 0
            }
          }
        ]
      },
      "implements": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "version": "string",
      "properties": {
        "property-identifier1": {
          "name": "string",
          "description": "string",
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string",
          "source": {
            "type": "view",
            "space": "string",
            "externalId": "string",
            "version": "string"
          }
        },
        "property-identifier2": {
          "name": "string",
          "description": "string",
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string",
          "source": {
            "type": "view",
            "space": "string",
            "externalId": "string",
            "version": "string"
          }
        }
      }
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "name": "string",
      "description": "string",
      "filter": {
        "and": [
          {
            "in": {
              "property": [
                "tag"
              ],
              "values": [
                10011,
                10011
              ]
            }
          },
          {
            "range": {
              "property": [
                "weight"
              ],
              "gte": 0
            }
          }
        ]
      },
      "implements": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "version": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "writable": true,
      "usedFor": "node",
      "isGlobal": true,
      "properties": {
        "view-or-relation-identifier1": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        },
        "view-or-relation-identifier2": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        }
      }
    }
  ]
}
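For orientation, a minimal view specification maps each view property to a property in a container. A sketch of such a request body, with made-up space, externalId, version, and property names:

```python
# Minimal view upsert body: one property mapped from a container.
# All space/externalId/version values here are illustrative only.
view_body = {
    "items": [
        {
            "space": "my-space",
            "externalId": "PumpView",
            "version": "v1",
            "properties": {
                "serialNumber": {
                    # The container (and the property within it) that
                    # backs this view property.
                    "container": {
                        "type": "container",
                        "space": "my-space",
                        "externalId": "Pump",
                    },
                    "containerPropertyIdentifier": "serialNumber",
                }
            },
        }
    ]
}
```

Optional fields such as filter and implements, shown in the full sample above, can be added to restrict matching instances or to inherit properties from other views.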
Delete one or more views. Currently limited to 100 views at a time. The service cannot delete a view referenced by a data model.
List of references to views you want to delete.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
List views defined in the current project. You can filter the list by specifying a space.
limit | integer [ 1 .. 1000 ] Default: 10 Limit the number of results returned. The largest result-set returned by the server will be 1000 items, even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
space | string [ 1 .. 43 ] characters ^[a-zA-Z0-9][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$ Example: space=timeseries The space to query. |
includeInheritedProperties | boolean Default: true Include properties inherited from views this view implements. |
allVersions | boolean Default: false If all versions of the entity should be returned. Defaults to false which returns the latest version, attributed to the newest 'createdTime' field |
includeGlobal | boolean Default: false If the global items of the entity should be returned. Defaults to false which excludes global items. |
view_list = client.data_modeling.views.list(limit=5)
for view in client.data_modeling.views:
    view  # do something with the view
for view_list in client.data_modeling.views(chunk_size=10):
    view_list  # do something with the views
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "name": "string",
      "description": "string",
      "filter": {
        "and": [
          {
            "in": {
              "property": [
                "tag"
              ],
              "values": [
                10011,
                10011
              ]
            }
          },
          {
            "range": {
              "property": [
                "weight"
              ],
              "gte": 0
            }
          }
        ]
      },
      "implements": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "version": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "writable": true,
      "usedFor": "node",
      "isGlobal": true,
      "properties": {
        "view-or-relation-identifier1": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        },
        "view-or-relation-identifier2": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        }
      }
    }
  ],
  "nextCursor": "string"
}
Retrieve up to 100 views by their external ids.
includeInheritedProperties | boolean Default: true Include properties inherited from views this view implements. |
List of external-ids of views to retrieve.
required | Array of objects [ 1 .. 100 ] items |
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "version": "string"
    }
  ]
}
{
  "items": [
    {
      "externalId": "string",
      "space": "string",
      "name": "string",
      "description": "string",
      "filter": {
        "and": [
          {
            "in": {
              "property": [
                "tag"
              ],
              "values": [
                10011,
                10011
              ]
            }
          },
          {
            "range": {
              "property": [
                "weight"
              ],
              "gte": 0
            }
          }
        ]
      },
      "implements": [
        {
          "type": "view",
          "space": "string",
          "externalId": "string",
          "version": "string"
        }
      ],
      "version": "string",
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "writable": true,
      "usedFor": "node",
      "isGlobal": true,
      "properties": {
        "view-or-relation-identifier1": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        },
        "view-or-relation-identifier2": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          },
          "container": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          },
          "containerPropertyIdentifier": "string"
        }
      }
    }
  ]
}
Add or update (upsert) containers. For unchanged container specifications, the operation completes without making any changes. We will not update the lastUpdatedTime value for containers that remain unchanged.
Containers to add or update.
required | Array of objects (ContainerCreateDefinition) [ 1 .. 100 ] items List of containers to create/update |
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "usedFor": "node",
      "properties": {
        "containerPropertyIdentifier1": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          }
        },
        "containerPropertyIdentifier2": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          }
        }
      },
      "constraints": {
        "constraint-identifier1": {
          "constraintType": "requires",
          "require": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          }
        },
        "constraint-identifier2": {
          "constraintType": "requires",
          "require": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          }
        }
      },
      "indexes": {
        "index-identifier1": {
          "properties": [
            "string"
          ],
          "indexType": "btree",
          "cursorable": false
        },
        "index-identifier2": {
          "properties": [
            "string"
          ],
          "indexType": "btree",
          "cursorable": false
        }
      }
    }
  ]
}
{
  "items": [
    {
      "space": "string",
      "externalId": "string",
      "name": "string",
      "description": "string",
      "usedFor": "node",
      "properties": {
        "containerPropertyIdentifier1": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          }
        },
        "containerPropertyIdentifier2": {
          "nullable": true,
          "autoIncrement": false,
          "defaultValue": "string",
          "description": "string",
          "name": "string",
          "type": {
            "type": "text",
            "list": false,
            "collation": "ucs_basic"
          }
        }
      },
      "constraints": {
        "constraint-identifier1": {
          "constraintType": "requires",
          "require": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          }
        },
        "constraint-identifier2": {
          "constraintType": "requires",
          "require": {
            "type": "container",
            "space": "string",
            "externalId": "string"
          }
        }
      },
      "indexes": {
        "index-identifier1": {
          "properties": [
            "string"
          ],
          "indexType": "btree",
          "cursorable": false
        },
        "index-identifier2": {
          "properties": [
            "string"
          ],
          "indexType": "btree",
          "cursorable": false
        }
      },
      "createdTime": 0,
      "lastUpdatedTime": 0,
      "isGlobal": true
    }
  ]
}
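Tying the fields above together, a minimal container specification (here built as the plain JSON request body, with made-up space, externalId, and property names) needs only a space, an externalId, and at least one typed property; constraints and indexes are optional:

```python
import json

# A minimal container upsert body: one nullable text property, no
# constraints or indexes. All identifier values are examples only.
container_body = {
    "items": [
        {
            "space": "my-space",
            "externalId": "Pump",
            "usedFor": "node",
            "properties": {
                "serialNumber": {
                    "nullable": True,
                    "type": {"type": "text", "list": False},
                }
            },
        }
    ]
}

payload = json.dumps(container_body)  # body for the upsert request
```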
Delete one or more container constraints. Currently limited to 10 constraints at a time.
List of the references to constraints you want to delete.
required | Array of objects [ 1 .. 10 ] items |
{
  "items": [
    {
      "space": "string",
      "containerExternalId": "string",
      "identifier": "string"
    }
  ]
}
{- "items": [
- {
- "space": "string",
- "containerExternalId": "string",
- "identifier": "string"
}
]
}
Delete one or more containers. Currently limited to 100 containers at a time. You cannot delete a container while one or more data models or views reference it.
List of the spaces and external-ids for the containers you want to delete.
required | Array of objects [ 1 .. 100 ] items |
{- "items": [
- {
- "externalId": "string",
- "space": "string"
}
]
}
{- "items": [
- {
- "externalId": "string",
- "space": "string"
}
]
}
Delete one or more container indexes. Currently limited to 10 indexes at a time.
List of the references to indexes you want to delete.
required | Array of objects [ 1 .. 10 ] items |
{- "items": [
- {
- "space": "string",
- "containerExternalId": "string",
- "identifier": "string"
}
]
}
{- "items": [
- {
- "space": "string",
- "containerExternalId": "string",
- "identifier": "string"
}
]
}
List of containers defined in the current project. You can filter the list by specifying a space.
limit | integer [ 1 .. 1000 ] Default: 10 Limit the number of results returned. The largest result-set returned by the server will be 1000 items, even if you specify a higher limit. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor for paging through results. |
space | string [ 1 .. 43 ] characters ^[a-zA-Z0-9][a-zA-Z0-9_-]{0,41}[a-zA-Z0-9]?$ Example: space=timeseries The space to query. |
includeGlobal | boolean Default: false If the global items of the entity should be returned. Defaults to false which excludes global items. |
container_list = client.data_modeling.containers.list(limit=5)

for container in client.data_modeling.containers:
    container  # do something with the container

for container_list in client.data_modeling.containers(chunk_size=10):
    container_list  # do something with the containers
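If you call the endpoint directly instead of through the SDK, pagination follows the nextCursor convention described at the top of this reference: repeat the request with the returned cursor until none is returned. A sketch against a generic fetch callable (the function names are illustrative; fetch_page stands in for your HTTP call):

```python
def iterate_containers(fetch_page, limit=1000):
    """Page through a list endpoint; fetch_page(cursor, limit) returns the response dict."""
    cursor = None
    while True:
        page = fetch_page(cursor, limit)
        for item in page["items"]:
            yield item
        cursor = page.get("nextCursor")
        if cursor is None:  # no cursor in the response means no more results
            return
```

Remember that all query parameters except cursor must stay the same across pages.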
{- "items": [
- {
- "space": "string",
- "externalId": "string",
- "name": "string",
- "description": "string",
- "usedFor": "node",
- "properties": {
- "containerPropertyIdentifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "containerPropertyIdentifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "constraints": {
- "constraint-identifier1": {
- "constraintType": "requires",
- "require": {
- "type": "container",
- "space": "string",
- "externalId": "string"
}
}, - "constraint-identifier2": {
- "constraintType": "requires",
- "require": {
- "type": "container",
- "space": "string",
- "externalId": "string"
}
}
}, - "indexes": {
- "index-identifier1": {
- "properties": [
- "string"
], - "indexType": "btree",
- "cursorable": false
}, - "index-identifier2": {
- "properties": [
- "string"
], - "indexType": "btree",
- "cursorable": false
}
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "isGlobal": true
}
], - "nextCursor": "string"
}
Retrieve up to 100 containers by their specified external ids.
List of external-ids of containers to retrieve.
required | Array of objects [ 1 .. 100 ] items |
{- "items": [
- {
- "externalId": "string",
- "space": "string"
}
]
}
{- "items": [
- {
- "space": "string",
- "externalId": "string",
- "name": "string",
- "description": "string",
- "usedFor": "node",
- "properties": {
- "containerPropertyIdentifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "containerPropertyIdentifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "constraints": {
- "constraint-identifier1": {
- "constraintType": "requires",
- "require": {
- "type": "container",
- "space": "string",
- "externalId": "string"
}
}, - "constraint-identifier2": {
- "constraintType": "requires",
- "require": {
- "type": "container",
- "space": "string",
- "externalId": "string"
}
}
}, - "indexes": {
- "index-identifier1": {
- "properties": [
- "string"
], - "indexType": "btree",
- "cursorable": false
}, - "index-identifier2": {
- "properties": [
- "string"
], - "indexType": "btree",
- "cursorable": false
}
}, - "createdTime": 0,
- "lastUpdatedTime": 0,
- "isGlobal": true
}
]
}
Aggregate data for nodes or edges in a project. You can use an optional query or filter specification to limit the result.
Aggregation specification.
query | string Optional query string. The API will parse the query string, and use it to match the text properties on elements to use for the aggregate(s). |
properties | Array of strings [ 1 .. 200 ] items Optional list (array) of properties you want to apply the query above to. If you do not list any properties, you search through text fields by default. |
limit | integer [ 1 .. 1000 ] Default: 100 Limit the number of results returned. The default limit is currently at 100 items. |
Array of avg (object) or count (object) or min (object) or max (object) or sum (object) or histogram (object) (AggregationDefinition) <= 5 items | |
groupBy | Array of strings [ 1 .. 5 ] items The selection of fields to group the results by when doing aggregations. You can specify up to 5 items to group by. When you do not specify any aggregates, the fields listed in the |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or matchAll (object) or nested (object) or overlaps (object) or hasData (object))) (FilterDefinition) A filter Domain Specific Language (DSL) used to create advanced filter queries. | |
instanceType | string (InstanceType) Default: "node" Enum: "node" "edge" The type of instance |
required | object (ViewReference) Reference to a view |
{- "query": "string",
- "properties": [
- "string"
], - "limit": 100,
- "aggregates": [
- {
- "avg": {
- "property": "string"
}
}
], - "groupBy": [
- "string"
], - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}, - "instanceType": "node",
- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}
}
{- "items": [
- {
- "instanceType": "node",
- "group": {
- "name": "PumpName1",
- "tag": "tag01"
}, - "aggregates": [
- {
- "aggregate": "avg",
- "property": "duration",
- "value": 0.2
}
]
}
], - "typing": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}
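The aggregation request body above can be assembled from plain dicts. A sketch of such a builder (the helper name and defaults are illustrative, not part of the API or SDK):

```python
def build_aggregate_request(view, aggregates, group_by=None, query=None, limit=100):
    """Assemble an instance-aggregation request body for a given view reference.

    view: dict with "space", "externalId", "version" (the ViewReference fields).
    aggregates: list of aggregate specs, e.g. [{"avg": {"property": "duration"}}].
    """
    body = {
        "view": {"type": "view", **view},
        "aggregates": aggregates,
        "limit": limit,
        "instanceType": "node",
    }
    if group_by:
        body["groupBy"] = group_by  # up to 5 fields to group by
    if query:
        body["query"] = query  # optional free-text query
    return body
```

Note that at most 5 aggregates may be specified, and up to 5 groupBy fields.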
Create or update nodes and edges in a transaction. The items field of the payload is an array of objects, where each object describes a node or an edge to create, patch, or replace. The instanceType field of each object must be node or edge and determines how the rest of the object is interpreted. This operation is currently limited to 1000 nodes and/or edges at a time.
Individual nodes and edges are uniquely identified by their externalId and space. When there is no node or edge with the given externalId in the given space, an instance will be created, and the properties provided for each of the containers or views in the sources array will be populated for the node/edge. Nodes can also be created implicitly: when an edge between them is created (if autoCreateStartNodes and/or autoCreateEndNodes is set), or when a direct relation property is set, the target node does not exist, and autoCreateDirectRelations is set.
To add a node or edge, the user must have capabilities to access (write to) both the view(s) referenced in sources and the container(s) underlying these views, as well as any directly referenced containers.
When a node or edge (instance) with the given externalId already exists in a space, the properties named in the sources field will be written to the instance. Other properties will remain unchanged. To replace the whole set of properties for an instance (a node or an edge) rather than patch the instance, set the replace parameter to true.
If you use a writable view to update properties (that is, the source you are referring to in sources is a view), you must have write access to the view as well as all of its backing containers.
When a node/edge item has no changes compared to the existing instance - that is, when the supplied property values are equal to the corresponding values in the existing node/edge - the node/edge will stay unchanged. In this case, the lastUpdatedTime values for the nodes/edges in question will not change.
Nodes/edges to add or update.
required | Array of NodeWrite (object) or EdgeWrite (object) (NodeOrEdgeCreate) [ 1 .. 1000 ] items List of nodes and edges to create/update |
autoCreateDirectRelations | boolean Default: true Should we create missing target nodes of direct relations? If the target-container constraint has been specified for a direct relation, the target node cannot be auto-created. If you want to point direct relations to a space where you have only read access, this option must be set to false. |
autoCreateStartNodes | boolean Default: false Should we create missing start nodes for edges when ingesting? By default, the start node of an edge must exist before we can ingest the edge. |
autoCreateEndNodes | boolean Default: false Should we create missing end nodes for edges when ingesting? By default, the end node of an edge must exist before we can ingest the edge. |
skipOnVersionConflict | boolean Default: false If existingVersion is specified on any of the nodes/edges in the input, the default behaviour is that the entire ingestion will fail when version conflicts occur. If skipOnVersionConflict is set to true, items with version conflicts will be skipped instead. If no version is specified for nodes/edges, it will do the write directly. |
replace | boolean Default: false How do we behave when a property value exists? Do we replace all matching and existing values with the supplied values ( |
{- "items": [
- {
- "instanceType": "node",
- "existingVersion": 0,
- "space": "string",
- "externalId": "string",
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "properties": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
]
}
], - "autoCreateDirectRelations": true,
- "autoCreateStartNodes": false,
- "autoCreateEndNodes": false,
- "skipOnVersionConflict": false,
- "replace": false
}
{- "items": [
- {
- "instanceType": "node",
- "version": 0,
- "wasModified": true,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0
}
]
}
Delete nodes and edges in a transaction. Limited to 1000 nodes/edges at a time.
When a node is selected for deletion, all connected incoming and outgoing edges that point to or from it are also deleted. However, please note that the operation might fail if the node has a high number of edge connections. If this is the case, consider deleting the edges connected to the node before deleting the node itself.
List of types, spaces, and external-ids for nodes and edges to delete.
required | Array of objects [ 1 .. 1000 ] items |
{- "items": [
- {
- "instanceType": "node",
- "externalId": "string",
- "space": "string"
}
]
}
{- "items": [
- {
- "instanceType": "node",
- "externalId": "string",
- "space": "string"
}
]
}
Filter the instances - nodes and edges - in a project.
Filter based on the instance type, the name, the external-ids, and on properties. The filter supports sorting and pagination. Properties for up to 10 views can be retrieved in one query.
includeTyping | boolean (IncludeTyping) Default: false Should we return property type information as part of the result? |
Array of objects (SourceSelectorWithoutPropertiesV3) <= 10 items Retrieve properties from the listed - by reference - views. | |
instanceType | string Default: "node" Enum: "node" "edge" The type of instance you are querying for; an edge or a node. If the instance type isn't specified, we list nodes. |
cursor | string |
limit | integer [ 1 .. 1000 ] Default: 1000 Limits the number of results to return. |
Array of objects (PropertySortV3) <= 5 items | |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or matchAll (object) or nested (object) or overlaps (object) or hasData (object))) (FilterDefinition) A filter Domain Specific Language (DSL) used to create advanced filter queries. |
{- "includeTyping": false,
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}
}
], - "instanceType": "node",
- "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo",
- "limit": 1000,
- "sort": [
- {
- "property": [
- "string"
], - "direction": "ascending",
- "nullsFirst": false
}
], - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}
}
{- "items": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
], - "typing": {
- "space-name1": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}, - "space-name2": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}
}, - "nextCursor": "string"
}
The Data Modelling API exposes an advanced query interface. The query interface supports parameterization, recursive edge traversal, chaining of result sets, and granular property selection.
A query is composed of a with section defining result set expressions that describe the input to the query, an optional set of parameter placeholders if the query is parameterized, and a select section that defines which properties to return as part of the result.
Imagine you have a data set with airplanes and airports, represented as two sets of nodes with edges between them indicating in which airports the airplanes land. Here is an example of a query which fetches a specific airplane as well as the airports it lands in:
with:
  airplanes:
    nodes:
      filter:
        equals:
          property: ["node", "externalId"]
          value: {"parameter": "airplaneExternalId"}
    limit: 1
  lands_in_airports:
    edges:
      from: airplanes
      maxDistance: 1
      direction: outwards
      filter:
        equals:
          property: ["edge", "type"]
          value: ["aviation", "lands-in"]
  airports:
    nodes:
      from: lands_in_airports
parameters:
  airplaneExternalId: myFavouriteAirplane
select:
  airplanes: {}
  airports: {}
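The same query expressed as the JSON request body (here as a Python dict) that the query endpoint accepts; the placement of limit as a sibling of nodes follows the result-set-expression fields described below:

```python
# JSON-equivalent of the airplanes/airports query above.
query = {
    "with": {
        "airplanes": {
            "nodes": {
                "filter": {
                    "equals": {
                        "property": ["node", "externalId"],
                        "value": {"parameter": "airplaneExternalId"},
                    }
                }
            },
            "limit": 1,
        },
        "lands_in_airports": {
            "edges": {
                "from": "airplanes",
                "maxDistance": 1,
                "direction": "outwards",
                "filter": {
                    "equals": {
                        "property": ["edge", "type"],
                        "value": ["aviation", "lands-in"],
                    }
                },
            }
        },
        "airports": {"nodes": {"from": "lands_in_airports"}},
    },
    "parameters": {"airplaneExternalId": "myFavouriteAirplane"},
    "select": {"airplanes": {}, "airports": {}},
}
```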
Result set expressions appear directly below with in a query, and define a set of either nodes or edges. The set may be used to return results, as stepping stones to derive other sets from, or both. Result set expressions are named and can be chained, as we'll see examples of later. A result set expression may also define a sort order and a limit. See sorting for more details.
While result set expressions may relate to each other via chaining, they don't have to. You can query for entirely unrelated things in the same query, but different sets are generally used to power graph traversals.
A set either queries nodes or it queries edges, possibly recursively.
All fields:
nodes: An object to specify a result set of matching nodes.
edges: An object to specify a result set of matching edges.
sort: A list of sort configurations.
limit: How many nodes or edges to return in the result. Default: 100.
The max limit you can set for any table expression is 10,000. In order to support retrieving the entire result set, pagination cursors are emitted for each result set expression, allowing you to page through everything. Pagination cursors cannot be combined with custom sorts. If no pagination cursor is present for a given result set expression in the response, there is guaranteed to be no more data that matches.
A nodes statement in your result set expression will make the set contain nodes. A node result set can be chained off both node and edge result set expressions. When chaining off another node result set, you will retrieve the nodes pointed to by a given direct relation property; this direct relation property is defined using the through field. When chaining off an edge result set, you will retrieve the end nodes defined by the edges in the set.
from: A different result set expression to chain from.
through: The property to join the from set "through". The through property must be a direct relation.
chainTo: Controls which side of the edge to chain to. This option is only applicable if the view referenced in the from field consists of edges. chainTo can be one of:
source: will chain to start if you're following edges outwards, i.e. direction=outwards. If you're following edges inwards, i.e. direction=inwards, it will chain to end.
destination (default): will chain to end if you're following edges outwards, i.e. direction=outwards. If you're following edges inwards, i.e. direction=inwards, it will chain to start.
filter: A filter to determine which nodes to match and thus be returned in the respective result set.
An edges statement in a result set expression will make the set contain edges, and the statement defines the rules the graph traversal will follow.
A graph traversal can start from some initial set. This can be defined by from, which will name another result set expression. The graph traversal follows edges in a particular direction, controlled by direction, which defaults to outwards.
Alice -is_parent-> Bob
Bob -fancies-> Mallory
Given the above graph, if you follow any edge from Bob outwards (the default), you'll get the edge Bob -fancies-> Mallory. If you follow edges inwards, i.e. direction=inwards, you'll get Alice -is_parent-> Bob.
The traversal happens breadth first. See limitations for more details.
A traversal is defined by what edges to follow, what nodes to match, and what nodes to terminate traversal at. This is controlled by filter, nodeFilter, and terminationFilter.
filter is a filter on edges. You would typically filter on the property [edge, type], but any property on an edge can be filtered on.
nodeFilter is a node filter, which the node on the "other" side must match. With direction: outwards, that means the "end node" of the edge must match. With direction: inwards, the "start node" must match.
terminationFilter is similar to nodeFilter, except if it matches, traversal will end. A node must also match nodeFilter (if any) to steer the traversal to the node to terminate at in the first place.
maxDistance controls how many hops away from the initial set traversal will go. maxDistance defaults to unlimited (but the set must respect its limit, defined on the result set expression). If maxDistance is 1, execution might be faster, so if you know there will only be one level, it's worth configuring maxDistance: 1.
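The breadth-first, direction-aware traversal described above can be sketched in a few lines. This is an illustrative client-side model of the semantics, not the server's implementation; the edge list and function name are made up for the example:

```python
from collections import deque

def traverse(edges, start, direction="outwards", max_distance=None):
    """Breadth-first edge traversal sketch.

    edges: list of (start_node, end_node) pairs.
    Returns the edges discovered, in breadth-first order.
    """
    found, seen, queue = [], {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if max_distance is not None and dist >= max_distance:
            continue  # respect maxDistance: stop expanding beyond this hop count
        for s, e in edges:
            # outwards follows start -> end; inwards follows end -> start
            src, dst = (s, e) if direction == "outwards" else (e, s)
            if src == node:
                found.append((s, e))
                if dst not in seen:
                    seen.add(dst)
                    queue.append((dst, dist + 1))
    return found

edges = [("Alice", "Bob"), ("Bob", "Mallory")]
traverse(edges, "Bob")                        # follows Bob -> Mallory
traverse(edges, "Bob", direction="inwards")   # follows Alice -> Bob
```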
Full options:
from: Result set expression to chain from.
filter: Edges traversed must match this filter.
nodeFilter: Nodes on the "other" side of the edge must match this filter.
terminationFilter: Do not traverse beyond nodes matching this filter.
maxDistance: How many levels to traverse. Default: unlimited.
direction: Whether to traverse edges pointing out of the initial set, or into the initial set.
limitEach: Limit the number of returned edges for each of the source nodes in the result set. The indicated uniform limit applies to the result set from the referenced from. limitEach only has meaning when you also specify maxDistance=1 and from.
Select configurations appear directly below select in a query. These specify which data to retrieve for the respective result set expression. Each entry specifies a number of sources (views) and a property selector for each of these. The property selectors define which view properties will be emitted in the query result.
It's possible to have sets whose properties are not emitted. This can be useful if the sets are necessary for chaining, but not actually interesting to include in the final results. Sets that are neither chained nor selected will not be executed (but will cause very slight query processing overhead).
Results are grouped by their respective sets, and the results contain properties that match the property selectors for the set.
Filters define what a part of the query matches. Filters are tree structures where the operator comes first, and then the parameters for that operator.
A simple example is the in
filter:
in:
  property: [node, name]
  values: [movie]
If the property node.name, which is a text property, is equal to any of the values in the provided list, the node will match. Properties are typed. The query operators you can use on a property depend on its type.
An exhaustive list of filters and their descriptions can be found by examining the request body schema below.
Filters can be combined with and
/or
/not
:
and:
  - not:
      in:
        property: [node, type]
        values: [movie]
  - range:
      property: [imdb, movie, released]
      gte: {parameter: start}
This would correspond to (NOT node.type in ('movie')) AND imdb.movie.released >= $start.
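In a JSON request body, the combined filter above looks like this (shown as a Python dict; the variable name is illustrative):

```python
# (NOT node.type in ('movie')) AND imdb.movie.released >= $start,
# expressed in the filter DSL as nested dicts.
combined_filter = {
    "and": [
        {"not": {"in": {"property": ["node", "type"], "values": ["movie"]}}},
        {"range": {"property": ["imdb", "movie", "released"],
                   "gte": {"parameter": "start"}}},
    ]
}
```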
Although this filter is documented in the request body schema, it merits a more detailed explanation.
A hasData filter will match if data is present in a given set of containers or views.
There is an implicit AND
between the containers and views referenced in the filter, so the filter will match
if and only if the node/edge has data in all of the specified containers and views.
When a container is specified, the filter will match if the instance has all required properties populated for that particular container.
When a view is specified, the filter will match nodes with data in all of the containers which the view references through properties, respecting the filters of the view if defined (and the filters of views implemented by the view).
Example:
hasData:
  - type: container
    space: my_space
    externalId: my_container
  - type: view
    space: my_space
    externalId: my_view
    version: v1
Assume my_space.my_view.v1 maps properties in the containers my_space.c1 and my_space.c2. If no filter is defined on my_space.my_view.v1, the filter will match if there is data in my_space.my_container AND (my_space.c1 AND my_space.c2). If a filter is defined on my_space.my_view.v1, it will match my_space.my_container AND my_space.my_view.v1.filter.
Values in filters can be parameterised. Parameters are provided as part of the query object, and not in the filter itself.
This filter is parameterised:
range:
  property: [imdb, movie, released]
  gte: {parameter: start}
A query containing this filter will only run if the parameter start is provided. The parameter must be compatible with all the types and operators that refer to it. In the above example, the "released" property is a date. Thus, the start parameter must be compatible with the date type, or the query will fail completely, even if the range filter is optional because it's OR-ed.
TIP
Parameterise your filters!
It's a best practice to parameterise queries that take user input. This enables reusing query plans across queries, which will be noticeable with read-heavy workloads.
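The API resolves {"parameter": name} placeholders server-side from the query's parameters object. A client-side sketch of that substitution, useful for testing filters locally (the function name is illustrative):

```python
def bind_parameters(node, params):
    """Recursively replace {"parameter": name} placeholders in a filter tree.

    Returns a new tree; the input is left untouched. This mimics what the
    server does with the query's parameters object.
    """
    if isinstance(node, dict):
        if set(node) == {"parameter"}:
            return params[node["parameter"]]  # substitute the placeholder
        return {k: bind_parameters(v, params) for k, v in node.items()}
    if isinstance(node, list):
        return [bind_parameters(v, params) for v in node]
    return node
```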
Sorting and limiting can happen in multiple places in a query:
In the with object that defines a node or edge set.
In select, where defined sets can be emitted as results.
Sorting and limiting the set definitions under with will transitively affect dependent sets. Changes to a set defined under with will naturally affect sets that depend on it, transitively. If you only change the sort order of a with expression, dependent sets will not (necessarily) change (depending on how the dependent sets are defined), but if you put a limit on an expression, all dependent sets will inherit this and change as a consequence. This is also true for sets that aren't actually emitted via select, i.e. sets that are only defined as stepping-stones for other sets.
Sorts and limits defined under select change the order in which results appear for that set only, not for depending sets.
Example:
with:
  some_nodes:
    … # omitted. No sort here
  some_edges:
    from: some_nodes
    # omitted. also no sorting
  target_nodes:
    from: some_edges
    # …
select:
  some_nodes:
    properties: ...
    sort:
      - {property: [node, created], direction: descending}
    limit: 100
The above query would still let some_edges and target_nodes pull from the full amount of nodes in some_nodes, even though what's returned as a result for some_nodes is capped at 100.
NOTE
A limit in an edge traversal applies to when to start traversing, which happens before sorting.
Nodes and edges have subtly different sorting and limiting behaviour: nodes sort and limit simultaneously, while recursive edge exploration does a limited traversal and then sorts.
The top-n set of nodes sorted by some property will be guaranteed to have the top-n of that property for the set.
For edges found through traversal, i.e. via edges, the limit applies to how many edges to discover. This may not be all the edges that could have been discovered in a full traversal. If you start traversing from some node and ask for 100 edges sorted by creation timestamp, the 100 edges discovered before traversal stops get sorted. The full graph is not traversed in order to find the 100 newest edges that exist in the graph defined by the traversal filters. Therefore, to do sorting with a recursive graph traversal, you'll need to specify the sort configuration via postSort. An edge traversal with maxDistance=1 can take a normal sort configuration, however.
Any query that involves a graph traversal will force nested loop-style execution. This will work well enough for traversals limited to a few hundred thousand unique paths.
The graph traversal is breadth first. All possible paths are traversed. This is important to keep in mind with traversals across loops. For example, a fully connected graph will not do well in a query that follows all possible paths, and is very likely to be terminated due to constraints on either time, memory, or temporary disk usage.
Queries get cancelled with a 408 Request Timeout error if they take longer than the timeout. If you hit a timeout like this, you will need to reduce load or contention, or optimise your query.
Query specification.
required | object [ 1 .. 20 ] properties |
object <= 20 properties Cursors returned from the previous query request. These cursors match the result set expressions you specified in the | |
required | object [ 1 .. 20 ] properties |
object Values in filters can be parameterised. Parameters are provided as part of the query object, and referenced in the filter itself. |
{- "with": {
- "result-expression-name1": {
- "sort": [
- {
- "property": [
- "string"
], - "direction": "ascending",
- "nullsFirst": false
}
], - "limit": 0,
- "nodes": {
- "from": "string",
- "chainTo": "source",
- "through": {
- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "identifier": "string"
}, - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}
}
}, - "result-expression-name2": {
- "sort": [
- {
- "property": [
- "string"
], - "direction": "ascending",
- "nullsFirst": false
}
], - "limit": 0,
- "nodes": {
- "from": "string",
- "chainTo": "source",
- "through": {
- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "identifier": "string"
}, - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}
}
}
}, - "cursors": {
- "pagination cursor reference1": "string",
- "pagination cursor reference2": "string"
}, - "select": {
- "result-expression-name1": {
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "properties": [
- "string"
]
}
], - "sort": [
- {
- "property": [
- "string"
], - "direction": "ascending",
- "nullsFirst": false
}
], - "limit": 0
}, - "result-expression-name2": {
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "properties": [
- "string"
]
}
], - "sort": [
- {
- "property": [
- "string"
], - "direction": "ascending",
- "nullsFirst": false
}
], - "limit": 0
}
}, - "parameters": {
- "parameter-identifier1": "string",
- "parameter-identifier2": "string"
}
}
{- "items": {
- "result-expression1": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
], - "result-expression2": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
]
}, - "nextCursor": {
- "cursor-name1": "string",
- "cursor-name2": "string"
}
}
Retrieve up to 1000 nodes or edges by their external IDs.
List of external IDs for the nodes or edges to retrieve. Properties from up to 10 unique views (in total across the requested external IDs) can be retrieved in one query.
Array of objects (SourceSelectorWithoutPropertiesV3) <= 10 items Retrieve properties from the views listed by reference. | |
required | Array of objects [ 1 .. 1000 ] items |
includeTyping | boolean (IncludeTyping) Default: false Should we return property type information as part of the result? |
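As a sketch, the request body shown in the example below can be assembled programmatically. The helper name and validation here are illustrative, not part of the API; the endpoint simply expects the JSON body:

```python
def build_byids_request(items, sources=None, include_typing=False):
    """Build a retrieve-by-external-IDs request body (illustrative helper).

    items: 1-1000 dicts, each with instanceType, space, and externalId.
    sources: optional view references (at most 10 unique views in total).
    """
    if not 1 <= len(items) <= 1000:
        raise ValueError("between 1 and 1000 items per request")
    body = {"items": items, "includeTyping": include_typing}
    if sources:
        # Each view reference is wrapped in a {"source": ...} selector.
        body["sources"] = [{"source": s} for s in sources]
    return body

body = build_byids_request(
    [{"instanceType": "node", "space": "mySpace", "externalId": "node-1"}],
    sources=[{"type": "view", "space": "mySpace", "externalId": "myView", "version": "1"}],
)
```

The resulting dict can be serialized as the POST body shown in the request sample.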
{- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}
}
], - "items": [
- {
- "instanceType": "node",
- "externalId": "string",
- "space": "string"
}
], - "includeTyping": false
}
{- "items": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
], - "typing": {
- "space-name1": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}, - "space-name2": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}
}
}
Search text fields in views for nodes or edges. The service returns up to 1000 results, ordered by relevance across the specified spaces.
The search specification.
required | object (ViewReference) Reference to a view |
query | string Query string that will be parsed and used for search. |
instanceType | string Default: "node" Enum: "node" "edge" Limit the search query to nodes or edges. If you do not set the instance type, the service defaults to searching nodes within the view. |
properties | Array of strings Optional array of properties you want to search through. If you do not specify one or more properties, the service will search all text fields within the view. |
(BoolFilter (and (object) or or (object) or not (object))) or (LeafFilter (equals (object) or in (object) or range (object) or prefix (object) or exists (object) or containsAny (object) or containsAll (object) or matchAll (object) or nested (object) or overlaps (object) or hasData (object))) (FilterDefinition) A filter Domain Specific Language (DSL) used to create advanced filter queries. | |
limit | integer [ 1 .. 1000 ] Default: 1000 Limits the number of results to return. |
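A minimal sketch of assembling the search request body described above. The helper is illustrative (not part of the API); it only mirrors the documented fields and their constraints:

```python
def build_search_request(view, query, instance_type="node",
                         properties=None, filter=None, limit=1000):
    """Build a search request body (illustrative helper).

    view: a ViewReference dict (type, space, externalId, version).
    filter: an optional FilterDefinition dict, passed through verbatim.
    """
    if instance_type not in ("node", "edge"):
        raise ValueError("instanceType must be 'node' or 'edge'")
    if not 1 <= limit <= 1000:
        raise ValueError("limit must be in [1, 1000]")
    body = {"view": view, "query": query,
            "instanceType": instance_type, "limit": limit}
    if properties:
        # Omitting properties means the service searches all text fields.
        body["properties"] = properties
    if filter:
        body["filter"] = filter
    return body
```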
{- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "query": "string",
- "instanceType": "node",
- "properties": [
- "string"
], - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}, - "limit": 1000
}
{- "items": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
], - "typing": {
- "space-name1": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}, - "space-name2": {
- "view-or-container-external-id1": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}, - "view-or-container-external-id2": {
- "property-identifier1": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}, - "property-identifier2": {
- "nullable": true,
- "autoIncrement": false,
- "defaultValue": "string",
- "description": "string",
- "name": "string",
- "type": {
- "type": "text",
- "list": false,
- "collation": "ucs_basic"
}
}
}
}
}
}
Subscribe to changes for nodes and edges in a project, matching a supplied filter. This endpoint always returns a nextCursor. The sync specification mirrors the query interface, but sorting is not currently supported.
Change filter specification
required | object [ 1 .. 20 ] properties |
object <= 20 properties Cursors returned from the previous sync request. These cursors match the result set expressions you specified in the `with` object. | |
required | object [ 1 .. 20 ] properties |
object Parameters to return |
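Because each sync response carries a nextCursor per result set expression, a client typically feeds those cursors straight back into the next request. A minimal sketch, assuming a caller-supplied `fetch` callable that POSTs the body and returns the parsed JSON response (the helper names are illustrative, not part of the API):

```python
def sync_changes(fetch, request, max_rounds=3):
    """Poll the sync endpoint, feeding each response's nextCursor back
    into the next request's cursors field (illustrative sketch)."""
    results = []
    for _ in range(max_rounds):
        response = fetch(request)
        results.append(response["items"])
        # nextCursor maps result-expression names to cursors; reuse them verbatim.
        request = dict(request, cursors=response["nextCursor"])
    return results
```

A production client would also handle 429 responses with truncated exponential backoff, as recommended in the request-throttling guidance.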
{- "with": {
- "property1": {
- "limit": 0,
- "nodes": {
- "from": "string",
- "chainTo": "source",
- "through": {
- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "identifier": "string"
}, - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}
}
}, - "property2": {
- "limit": 0,
- "nodes": {
- "from": "string",
- "chainTo": "source",
- "through": {
- "view": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "identifier": "string"
}, - "filter": {
- "and": [
- {
- "in": {
- "property": [
- "tag"
], - "values": [
- 10011,
- 10011
]
}
}, - {
- "range": {
- "property": [
- "weight"
], - "gte": 0
}
}
]
}
}
}
}, - "cursors": {
- "sync cursor reference1": "string",
- "sync cursor reference2": "string"
}, - "select": {
- "property1": {
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "properties": [
- "string"
]
}
]
}, - "property2": {
- "sources": [
- {
- "source": {
- "type": "view",
- "space": "string",
- "externalId": "string",
- "version": "string"
}, - "properties": [
- "string"
]
}
]
}
}, - "parameters": {
- "parameter-identifier1": "string",
- "parameter-identifier2": "string"
}
}
{- "items": {
- "result-expression1": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
], - "result-expression2": [
- {
- "instanceType": "node",
- "version": 0,
- "space": "string",
- "externalId": "string",
- "createdTime": 0,
- "lastUpdatedTime": 0,
- "deletedTime": 0,
- "properties": {
- "space-name1": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}, - "space-name2": {
- "view-or-container-identifier1": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}, - "view-or-container-identifier2": {
- "property-identifier1": "string",
- "property-identifier2": "string"
}
}
}
}
]
}, - "nextCursor": {
- "cursor-name1": "string",
- "cursor-name2": "string"
}
}
required | Array of objects = 1 items |
{- "items": [
- {
- "externalId": "string",
- "description": "string"
}
]
}
{- "items": [
- {
- "externalId": "string",
- "description": "string",
- "createdTime": 0
}
]
}
ignoreUnknownIds | boolean Default: false If true, ignore unknown workflow IDs. If false, a 404 is returned if any of the workflow IDs are unknown, in which case none of the workflows are deleted. |
Array of objects [ 1 .. 100 ] items |
{- "items": [
- {
- "externalId": "string"
}
]
}
{ }
externalId required | string (WorkflowExternalId) <= 255 characters Identifier for a Workflow. Must be unique for the project. |
{- "externalId": "string",
- "description": "string",
- "createdTime": 0
}
Array of objects = 1 items |
{- "items": [
- {
- "workflowExternalId": "string",
- "version": "string",
- "workflowDefinition": {
- "description": "string",
- "tasks": [
- {
- "externalId": "my.known.id",
- "type": "function",
- "name": "string",
- "description": "string",
- "parameters": {
- "function": {
- "externalId": "my.known.id",
- "data": {
- "key1": "value1",
- "key2": "value2"
}
}, - "isAsyncComplete": false
}, - "retries": 3,
- "timeout": 3600,
- "dependsOn": [
- {
- "externalId": "my.known.id"
}
]
}
]
}
}
]
}
{- "items": [
- {
- "workflowExternalId": "string",
- "version": "string",
- "workflowDefinition": {
- "hash": "string",
- "description": "string",
- "tasks": [
- {
- "externalId": "my.known.id",
- "type": "function",
- "name": "string",
- "description": "string",
- "parameters": {
- "function": {
- "externalId": "my.known.id",
- "data": {
- "key1": "value1",
- "key2": "value2"
}
}, - "isAsyncComplete": false
}, - "retries": 3,
- "timeout": 3600,
- "dependsOn": [
- {
- "externalId": "my.known.id"
}
]
}
]
}
}
]
}
ignoreUnknownIds | boolean Default: false If true, ignore unknown version IDs. If false, a 404 is returned if any of the version IDs are unknown, in which case none of the versions are deleted. |
Array of objects [ 1 .. 100 ] items |
{- "items": [
- {
- "version": "string",
- "workflowExternalId": "string"
}
]
}
{ }
object (ListVersionsFilter) |
{- "filter": {
- "workflowFilters": [
- {
- "externalId": "string",
- "version": "string"
}
]
}
}
{- "items": [
- {
- "workflowExternalId": "string",
- "version": "string",
- "workflowDefinition": {
- "hash": "string",
- "description": "string",
- "tasks": [
- {
- "externalId": "my.known.id",
- "type": "function",
- "name": "string",
- "description": "string",
- "parameters": {
- "function": {
- "externalId": "my.known.id",
- "data": {
- "key1": "value1",
- "key2": "value2"
}
}, - "isAsyncComplete": false
}, - "retries": 3,
- "timeout": 3600,
- "dependsOn": [
- {
- "externalId": "my.known.id"
}
]
}
]
}
}
]
}
externalId required | string (WorkflowExternalId) <= 255 characters Identifier for a Workflow. Must be unique for the project. |
version required | string (Version) <= 255 characters Identifier for a Version. Must be unique for the Workflow. |
{- "workflowExternalId": "string",
- "version": "string",
- "workflowDefinition": {
- "hash": "string",
- "description": "string",
- "tasks": [
- {
- "externalId": "my.known.id",
- "type": "function",
- "name": "string",
- "description": "string",
- "parameters": {
- "function": {
- "externalId": "my.known.id",
- "data": {
- "key1": "value1",
- "key2": "value2"
}
}, - "isAsyncComplete": false
}, - "retries": 3,
- "timeout": 3600,
- "dependsOn": [
- {
- "externalId": "my.known.id"
}
]
}
]
}
}
executionId required | string |
{- "id": "string",
- "workflowExternalId": "string",
- "workflowDefinition": {
- "hash": "string",
- "description": "string",
- "tasks": [
- {
- "externalId": "my.known.id",
- "type": "function",
- "name": "string",
- "description": "string",
- "parameters": {
- "function": {
- "externalId": "my.known.id",
- "data": {
- "key1": "value1",
- "key2": "value2"
}
}, - "isAsyncComplete": false
}, - "retries": 3,
- "timeout": 3600,
- "dependsOn": [
- {
- "externalId": "my.known.id"
}
]
}
]
}, - "version": "string",
- "status": "RUNNING",
- "engineExecutionId": "string",
- "executedTasks": [
- {
- "id": "string",
- "externalId": "my.known.id",
- "status": "IN_PROGRESS",
- "taskType": "function",
- "startTime": 0,
- "endTime": 0,
- "input": {
- "key1": "value1",
- "key2": "value2"
}, - "output": {
- "callId": 0,
- "functionId": 0,
- "response": { }
}, - "reasonForIncompletion": "string"
}
], - "input": {
- "key1": "value1",
- "key2": "value2"
}, - "createdTime": 1683792343592,
- "startTime": 1683792343592,
- "endTime": 1683792343592,
- "reasonForIncompletion": "string"
}
limit | integer [ 1 .. 1000 ] Default: 1000 Maximum number of results to return. |
cursor | string Example: cursor=4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo Cursor to use for paging through results. This cursor is returned from a previous request. If not specified, start from the first page of results. |
object (List Executions Filter) |
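Listing executions follows the standard cursor pagination described in the introduction. A minimal sketch of paging through all results, assuming a caller-supplied `fetch` callable that POSTs the body and returns the parsed JSON response (the helper is illustrative, not part of the API):

```python
def list_all_executions(fetch, filter=None, limit=1000):
    """Page through workflow executions until no nextCursor is returned
    (illustrative sketch; all parameters except cursor stay the same)."""
    body = {"limit": limit}
    if filter:
        body["filter"] = filter
    items = []
    while True:
        resp = fetch(body)
        items.extend(resp["items"])
        cursor = resp.get("nextCursor")
        if not cursor:
            break  # last page reached
        # Per the response sample, nextCursor is an object holding the cursor string.
        body["cursor"] = cursor["cursor"]
    return items
```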
{- "filter": {
- "workflowFilters": [
- {
- "externalId": "string",
- "version": "string"
}
], - "createdTimeStart": 0,
- "createdTimeEnd": 0
}
}
{- "items": [
- {
- "id": "string",
- "workflowExternalId": "string",
- "version": "string",
- "status": "RUNNING",
- "engineExecutionId": "string",
- "createdTime": 0,
- "startTime": 0,
- "endTime": 0,
- "reasonForIncompletion": "string"
}
], - "nextCursor": {
- "cursor": "4zj0Vy2fo0NtNMb229mI9r1V3YG5NBL752kQz1cKtwo"
}
}
externalId required | string (WorkflowExternalId) <= 255 characters Identifier for a Workflow. Must be unique for the project. |
version required | string (Version) <= 255 characters Identifier for a Version. Must be unique for the Workflow. |
required | object |
input | object |
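A minimal sketch of assembling the trigger-execution body. The helper is illustrative; the nonce is assumed to come from a previously created session (an assumption, not shown here):

```python
def build_execution_request(nonce, input=None):
    """Build the body for triggering a workflow execution
    (illustrative helper; nonce is assumed to come from a session)."""
    body = {"authentication": {"nonce": nonce}}
    if input is not None:
        # Optional input object passed to the workflow execution.
        body["input"] = input
    return body
```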
{- "authentication": {
- "nonce": "string"
}, - "input": {
- "key1": "value1",
- "key2": "value2"
}
}
{- "id": "string",
- "workflowExternalId": "string",
- "version": "string",
- "status": "RUNNING",
- "engineExecutionId": "string",
- "createdTime": 0,
- "startTime": 0,
- "endTime": 0,
- "reasonForIncompletion": "string"
}
taskId required | string |
status required | string Enum: "COMPLETED" "FAILED" |
output | object |
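For tasks configured with isAsyncComplete, a client reports the terminal status back through this endpoint. A minimal sketch of building that body (illustrative helper, not part of the API):

```python
def build_task_update(status, output=None):
    """Build the body for completing an async task callback
    (illustrative helper). status must be a terminal value."""
    if status not in ("COMPLETED", "FAILED"):
        raise ValueError("status must be COMPLETED or FAILED")
    body = {"status": status}
    if output is not None:
        body["output"] = output
    return body
```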
{- "status": "COMPLETED",
- "output": {
- "key1": "value1",
- "key2": "value2"
}
}
{- "id": "string",
- "externalId": "my.known.id",
- "status": "IN_PROGRESS",
- "taskType": "function",
- "startTime": 0,
- "endTime": 0,
- "input": {
- "key1": "value1",
- "key2": "value2"
}, - "output": {
- "callId": 0,
- "functionId": 0,
- "response": { }
}, - "reasonForIncompletion": "string"
}