
Uploading Agent Data

If an agent needs to upload data to MindSphere, MindSphere requires additional configuration in order to interpret the agent's data stream. This configuration requires the following definitions within MindSphere:

  • Data Source Provisioning
  • Property Set Provisioning
  • Mapping for Data Source and Property Set

Creating a Data Source Configuration

A Data Source is a logical group that holds so-called Data Points. Data Points hold metadata about a specific metric that the agent generates or measures.

For example, if an agent measures ambient temperature and pressure data, each of these two measurements needs to be defined as a separate Data Point:

  • Data Point 1: Temperature measurement
  • Data Point 2: Pressure measurement

A Data Source, on the other hand, is an encapsulating/grouping object for the Data Points.

Data Points are set by a Data Source Configuration. MindSphere provides the dataSourceConfiguration endpoint of the Agent Management Service to create Data Points and the Data Source they are grouped into. A sample configuration request is provided below:

PUT /api/agentmanagement/v3/agents/{{agent_id}}/dataSourceConfiguration HTTP/1.1
Content-Type: application/json
If-Match: etag
Authorization: Bearer eyx...
{
    "configurationId": "string",
    "dataSources": [
        {
        "name": "string",
        "description": "string",
        "dataPoints": [
            {
            "id": "string",
            "name": "string",
            "description": "string",
            "type": "int",
            "unit": "string",
            "customData": {}
            }
        ],
        "customData": {}
        }
    ]
}
The parameters have the following meaning:

  • dataSource.name: Name of the Data Source. Mandatory; any string is allowed.
  • dataSource.description: Description of the Data Source. Optional; any string is allowed.
  • dataPoint.id: Data Point ID. Mandatory; must be unique per agent, no duplicates are allowed.
  • dataPoint.name: Name of the Data Point. Mandatory; any string is allowed, e.g. pressure, voltage, current.
  • dataPoint.description: Description of the Data Point. Optional; any string is allowed.
  • dataPoint.type: Data type of the Data Point. Mandatory; can be a predefined type. By default, MindSphere provides the base types Double, Long, Int, String and Boolean.
  • dataPoint.unit: Unit of the Data Point. Mandatory; any string is allowed, e.g. percent, SI units. Empty strings are allowed.
  • dataPoint.customData: Custom data, if any is provided. Optional.
  • dataSource.customData: Custom data, if any is provided. Optional.

If successful, MindSphere returns an HTTP response 200 OK with a JSON body that holds the created Data Source Configuration:

{
    "id": "string",
    "eTag": "2",
    "configurationId": "string",
    "dataSources": [
        {
        "name": "string",
        "description": "string",
        "dataPoints": [
            {
            "id": "string",
            "name": "string",
            "description": "string",
            "type": "int",
            "unit": "string",
            "customData": {}
            }
        ],
        "customData": {}
        }
    ]
}

Warning

When an existing Data Source Configuration is updated, all Data Point Mappings of the agent are deleted.

Note

For consuming the exchange services, the parameter configurationId needs to be provided to MindSphere.
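The request body above can be assembled programmatically before issuing the PUT. The following is a minimal sketch; all ids, names and units are illustrative placeholders (not values defined by MindSphere), and the helper function is not part of any MindSphere SDK:

```python
import json

# Assemble the body for
# PUT /api/agentmanagement/v3/agents/{agent_id}/dataSourceConfiguration,
# enforcing the rule that dataPoint.id must be unique per agent.
def build_data_source_configuration(configuration_id, data_sources):
    ids = [dp["id"] for ds in data_sources for dp in ds["dataPoints"]]
    if len(ids) != len(set(ids)):
        raise ValueError("dataPoint.id must be unique per agent")
    return {"configurationId": configuration_id, "dataSources": data_sources}

config = build_data_source_configuration(
    "CF0001",  # hypothetical configurationId
    [{
        "name": "EnvironmentSensors",
        "description": "Ambient measurements",
        "dataPoints": [
            {"id": "DP0001", "name": "Temperature", "description": "",
             "type": "DOUBLE", "unit": "C", "customData": {}},
            {"id": "DP0002", "name": "Pressure", "description": "",
             "type": "DOUBLE", "unit": "kPa", "customData": {}},
        ],
        "customData": {},
    }],
)
body = json.dumps(config, indent=4)  # JSON body for the PUT request
```

The uniqueness check mirrors the dataPoint.id remark above, so an invalid configuration fails on the client before reaching MindSphere.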

Creating a Data Point Mapping

Before creating a Data Point Mapping, a Data Source (grouping object for the Data Points) and a Property Set (grouping object for the properties) must already be defined in MindSphere.

A Data Point Mapping needs to be defined for MindSphere to interpret the data flowing from the agent to MindSphere. The Data Source Configuration holds metadata about the agent side, whereas a Property Set holds metadata about the IoT side. Finally, MindSphere needs information to map Data Point metadata to Property metadata. This configuration is called a Data Point Mapping and defines a mapping from each data point to a property.

In order to create a mapping, MindSphere provides the dataPointMappings endpoint of the MindConnect API, which needs to be used with the following sample mapping parameters:

POST /api/mindconnect/v3/dataPointMappings HTTP/1.1
Content-Type: application/json
Authorization: Bearer eyc..
{
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0",
    "dataPointId": "DP0001",
    "entityId": "83e78008eadf453bae4f5c7bef3db550",
    "propertySetName": "ElectricalProperties",
    "propertyName": "Voltage"
}
The parameters are validated as follows:

  • agentId: String, validated, null-checked; must exist in Agent Management.
    Troubleshooting:
      HTTP 400: Agent is empty!
      HTTP 404: AgentId, {...} in the data Point mapping does not exist!
  • dataPointId: String, validated, null-checked; must exist in Agent Management and belong to the Data Source Configuration of the given agentId.
    Troubleshooting:
      HTTP 400: DataPointId is empty!
      HTTP 404: DataPointId, {...} in the data Point mapping does not exist in the related agent!
  • entityId: String, validated, null-checked; must exist in IoT Entity Service.
    Troubleshooting:
      HTTP 400: Entity Id is empty!
      HTTP 404: Not Found
  • propertySetName: String, validated, null-checked; must exist in IoT Type Service and belong to the entity type of entityId.
    Troubleshooting:
      HTTP 400: PropertySetName is empty!
      HTTP 404: Property Set not found with name {...}
      HTTP 404: Entity Type does not own a Property Set with name {...}
  • propertyName: String, validated; must exist in IoT Type Service and belong to the Property Set.
    Troubleshooting:
      HTTP 400: PropertyName is empty!
      HTTP 404: Property set does not own a Property with name {...}

Also, the following aspects have to be considered:

  • The unit and type of the data point and the property must match.
  • If the property unit is null, the data point unit must be an empty string.
  • The property category cannot be static.
  • A Data Point cannot be mapped to different properties with the same property name.

Before creating a Data Point Mapping, MindSphere executes the following checks:

(Figure: Create Data Point Mapping workflow and checks)

If successful, MindSphere returns an HTTP response 201 Created with a JSON body that holds the created mapping configuration. A mapping connects only a single data point to a single property; if more mappings are needed, this step must be repeated.

{
    "id": "4fad6258-5def-4d84-a4c2-1481b209c116",
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0",
    "dataPointId": "DP0001",
    "dataPointUnit": "%",
    "dataPointType": "INT",
    "entityId": "83e78008eadf453bae4f5c7bef3db550",
    "propertySetName": "ElectricalProperties",
    "propertyName": "Voltage",
    "propertyUnit": "%",
    "propertyType": "INT",
    "qualityEnabled": true
}

Warning

When an existing Data Source Configuration is updated, all Data Point Mappings of the agent are deleted.
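The unit, type and category constraints listed in this section can be pre-checked on the client before calling the dataPointMappings endpoint. A minimal sketch; the data point and property metadata are illustrative, and in practice come from the Data Source Configuration and the IoT Type Service:

```python
# Return the rule violations that would make MindSphere reject a mapping
# between one data point and one property.
def check_mapping(data_point, prop):
    problems = []
    # Unit and type of data point and property must match.
    if data_point["type"].upper() != prop["type"].upper():
        problems.append("type mismatch")
    # A null property unit requires an empty data point unit.
    expected_unit = prop["unit"] if prop["unit"] is not None else ""
    if data_point["unit"] != expected_unit:
        problems.append("unit mismatch")
    # The property category cannot be static.
    if prop.get("category") == "static":
        problems.append("property category cannot be static")
    return problems

dp = {"id": "DP0001", "type": "INT", "unit": "%"}
voltage = {"name": "Voltage", "type": "INT", "unit": "%", "category": "dynamic"}
assert check_mapping(dp, voltage) == []
```

Running such a check locally avoids a round trip that would otherwise end in an HTTP 400 response.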

Consuming Exchange Services

The exchange endpoint of the MindConnect API enables an agent to upload its data to MindSphere. The data can be of the following types:

  • Time Series
  • File Upload
  • Event Upload

The format conforms to a subset of the HTTP multipart specification, but permits only two levels of nesting.

Below is a sample exchange message content:

--f0Ve5iPP2ySppIcDSR6Bak
Content-Type: multipart/related;boundary=penFL6sBQHJJUN3HA4ftqC

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "standardTimeSeries",
        "version": "1.0",
        "details": {
            "configurationId": "{{_configuration_id}}"
        }
    }
}
--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/json
[
    {
        "timestamp": "2017-02-01T08:30:03.780Z",
        "values": [
            {
                "dataPointId": "{{_datapoint_id_1}}",
                "value": "9856",
                "qualityCode": "0"
            },
            {
                "dataPointId": "{{_datapoint_id_2}}",
                "value": "3766",
                "qualityCode": "0"
            }
        ]
    }
]
--penFL6sBQHJJUN3HA4ftqC--
--f0Ve5iPP2ySppIcDSR6Bak--

In the sample, the multipart message starts with the --f0Ve5iPP2ySppIcDSR6Bak boundary. The leading double dash indicates the start of a boundary, whereas a boundary enclosed in double dashes on both sides, as in --f0Ve5iPP2ySppIcDSR6Bak--, indicates the end of the message.

The multipart messages are nested within the initial boundary. Each multipart message consists of metadata and a payload; therefore, each message contains two boundary start identifiers and a single boundary end:

--initial boundary

--boundary1
Metadata

--boundary1
Payload

--boundary1--

--initial boundary--
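The skeleton above can be assembled mechanically. Below is a minimal sketch; the boundary strings, configuration id and data point ids are placeholders, and the helper names are illustrative, not part of any MindSphere SDK:

```python
import json

# One metadata/payload tuple delimited by an inner boundary.
def make_tuple(boundary, meta, payload_type, payload):
    return ("--%s\r\nContent-Type: application/vnd.siemens.mindsphere.meta+json\r\n\r\n%s\r\n"
            "--%s\r\nContent-Type: %s\r\n\r\n%s\r\n--%s--\r\n"
            % (boundary, meta, boundary, payload_type, payload, boundary))

# Wrap each tuple in a multipart/related part inside the outer boundary.
def make_exchange_body(outer, inner, tuples):
    parts = ["--%s\r\nContent-Type: multipart/related;boundary=%s\r\n\r\n%s"
             % (outer, inner, t) for t in tuples]
    return "".join(parts) + "--%s--\r\n" % outer

meta = json.dumps({"type": "item", "version": "1.0",
                   "payload": {"type": "standardTimeSeries", "version": "1.0",
                               "details": {"configurationId": "CF0001"}}})
data = json.dumps([{"timestamp": "2017-02-01T08:30:03.780Z",
                    "values": [{"dataPointId": "DP0001", "value": "9856",
                                "qualityCode": "0"}]}])
tuple_part = make_tuple("penFL6sBQHJJUN3HA4ftqC", meta, "application/json", data)
body = make_exchange_body("f0Ve5iPP2ySppIcDSR6Bak", "penFL6sBQHJJUN3HA4ftqC",
                          [tuple_part])
```

The resulting string starts with the outer boundary, contains the two inner boundary starts plus one inner boundary end per tuple, and closes with the double-dashed outer boundary, matching the skeleton.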

Within MindSphere, metadata and payload couples are referred to as tuples. Each tuple may contain a different type of data; for example, one tuple may contain time series data with a specific timestamp, whereas another may contain octet-stream data. The example below contains an exchange payload for the application/octet-stream MIME type:

--5c6d7e29ef6868d0eb73
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "file",
        "version": "1.0",
        "details": {
            "fileName":"data_collector.log.old",
            "creationDate":"2017-02-10T13:00:00.000Z",
            "fileType":"log"
        }
    }
}

--5c6d7e29ef6868d0eb73
Content-Type: application/octet-stream

  ... File content here ...

--5c6d7e29ef6868d0eb73--
--f0Ve5iPP2ySppIcDSR6Bak--

In order to upload data (e.g. time series, files or events), the agent must use the exchange endpoint of the MindConnect API. Below is a sample request:

POST {{_gateway_url}}/api/mindconnect/v3/exchange
Content-Type: multipart/mixed; boundary=f0Ve5iPP2ySppIcDSR6Bak
Authorization: Bearer access_token ...

--f0Ve5iPP2ySppIcDSR6Bak
Content-Type: multipart/related;boundary=penFL6sBQHJJUN3HA4ftqC

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "standardTimeSeries",
        "version": "1.0",
        "details": {
            "configurationId": "{{_configuration_id}}"
        }
    }
}

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/json
[
    {
        "timestamp": "2017-02-01T08:30:03.780Z",
        "values": [
            {
                "dataPointId": "{{_datapoint_id_1}}",
                "value": "9856",
                "qualityCode": "0"
            },
            {
                "dataPointId": "{{_datapoint_id_2}}",
                "value": "3766",
                "qualityCode": "0"
            }
        ]
    }
]
--penFL6sBQHJJUN3HA4ftqC--
--f0Ve5iPP2ySppIcDSR6Bak--

Note

The property configurationId in the metadata part of the multipart message tells MindSphere how to interpret the data it receives (see section Creating a Data Source Configuration).
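The time-series JSON part of this request can be generated from measured values. A minimal sketch; the data point ids are placeholders that must match the agent's Data Source Configuration, and values are sent as strings, as in the sample above:

```python
from datetime import datetime, timezone

# Build the time-series payload part of the exchange request.
# readings: iterable of (datetime, {dataPointId: value}) pairs.
def time_series_payload(readings):
    return [{
        # Millisecond-precision UTC timestamp, as used in the samples above.
        "timestamp": ts.strftime("%Y-%m-%dT%H:%M:%S.")
                     + "%03dZ" % (ts.microsecond // 1000),
        "values": [{"dataPointId": dp, "value": str(v), "qualityCode": "0"}
                   for dp, v in values.items()],
    } for ts, values in readings]

ts = datetime(2017, 2, 1, 8, 30, 3, 780000, tzinfo=timezone.utc)
payload = time_series_payload([(ts, {"DP0001": 9856, "DP0002": 3766})])
```

Serializing this list with json.dumps yields the application/json part of the multipart message shown above.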

Agents can upload business events of an event type that is derived from the type AgentBaseEvent. Here is a sample exchange request for uploading an event:

--FRaqbC9wSA2XvpFVjCRGry
Content-Type: multipart/related;boundary=5c6d7e29ef6868d0eb73

--5c6d7e29ef6868d0eb73
Content-Type: application/vnd.siemens.mindsphere.meta+json

{
    "type": "item",
    "version": "1.0",
    "payload": {
      "type": "businessEvent",
      "version": "1.0"
    }
}
--5c6d7e29ef6868d0eb73
Content-Type: application/json

[
    {
        "id": "7ba7b810-9dad-11d1-80b4-00c04fd430c8",
        "correlationId": "fd7fb194-cd73-4a54-9e53-97aca7bc8568",
        "timestamp": "2018-07-11T11:06:25.317Z",
        "severity": 2,
        "type": "{{eventType_from_AgentBaseEvent}}",
        "description": "file uploaded event",
        "version": "1.0",
        "details": {
          "requiredStringField": "required",
          "booleanField": "true",
          "stringField": "string",
          "integerField": "10",
          "doubleField": 10.0
        }
    }
]
--5c6d7e29ef6868d0eb73--
--FRaqbC9wSA2XvpFVjCRGry--

Upon accepting the structure of the multipart request, MindSphere returns an HTTP response 200 OK with an empty body. MindSphere validates and stores the data asynchronously. The agent can upload data as long as its access token is valid.

Replaying an Exchange Request

If MindSphere cannot process data after a successful upload via the MindConnect API, the data is not simply dropped. Instead, MindConnect's Recovery Service stores the unprocessed data for 30 calendar days. For a list of the stored unprocessed data, use the recoverableRecords endpoint. Its response is a page of recoverable records as shown below, which can be filtered by the agentId, correlationId and dropReason fields using a JSON filter.

{
      "id": "4fad6258-5def-4d84-a4c2-1481b209c116",
      "correlationId": "7had568-5def-4d84-a4c2-1481b209c116",
      "agentId": "33238f98784711e8adc0fa7ae01bbebc",
      "dropReason": " <[Dropped] TimeSeries Data is dropped. Validation failed for reason <Pre-processing of <153000> data points ended with <60600> valid and <92400> dropped data points. Following issues found: Data point with no mapping count is <92400>, including <[variable101, variable102, variable103, variable104, variable105, variable106, variable107, variable108, variable109, variable110]>.>>"
}

To inspect a record's payload, the MindConnect API provides the /recoverableRecords/{id}/downloadLink endpoint. The response contains a URL to which the MindConnect API has uploaded the record's data, e.g.:

GET /api/mindconnect/v3/recoverableRecords/{id}/downloadLink HTTP/1.1
"https://bucketname-s3.eu-central-1.amazonaws.com/c9bcd-44ab-4cfa-a87e-d81e727d9af4?X"

If an invalid mapping or registration has caused the data not to be processed, the exchange request can be replayed after correcting the definition. Use the /recoverableRecords/{id}/replay endpoint of the MindConnect API to trigger a replay.

Note

If the data can successfully be processed after a replay, it is removed from the recovery system after 2 days.
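A filtered query against the recoverableRecords endpoint can be constructed as sketched below. Passing the filter object JSON-encoded in a "filter" query parameter is an assumption modeled on the fields named above, and the agent id is a placeholder:

```python
import json
from urllib.parse import urlencode

# Hypothetical filter on the fields the endpoint supports
# (agentId, correlationId, dropReason).
record_filter = {"agentId": "33238f98784711e8adc0fa7ae01bbebc",
                 "dropReason": "Dropped"}

# urlencode percent-encodes the JSON so it is safe in a query string.
url = ("/api/mindconnect/v3/recoverableRecords?"
       + urlencode({"filter": json.dumps(record_filter)}))
```

The resulting URL can then be requested with the usual Bearer token to page through matching records.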

Diagnosing Exchange Services

Data uploaded via the MindConnect API can be diagnosed with the Diagnostic Service feature. In order to activate an agent for the Diagnostic Service, MindSphere provides the diagnosticActivations endpoint of the MindConnect API, which needs to be used with the following parameters:

POST /api/mindconnect/v3/diagnosticActivations HTTP/1.1
Content-Type: application/json
Authorization: Bearer eyc..
{
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0",
    "status": "ACTIVE"
}
The parameters are validated as follows:

  • agentId: String, validated.
    Troubleshooting:
      HTTP 400: Agent is empty!
      HTTP 404: Agent id {...} does not exist!
      Agent id {...} could not be enabled. Because agent limitation is {...} for tenant {...}!
  • status: String, validated. Allowed inputs are ACTIVE, INACTIVE and NULL; if NULL, the value is set to "ACTIVE" by default.
    Troubleshooting:
      HTTP 400: Custom statuses are not allowed. Input value must be one of: [ACTIVE, INACTIVE, NULL]

Note

The Diagnostic Service can be activated for up to 5 agents per tenant. Activations can be deleted via the diagnosticActivations/{id} endpoint.

A list of diagnostic activations is provided by the diagnosticActivations endpoint. Its response is a page of diagnostic activations as shown below, which can be filtered by the agentId and status fields using a JSON filter.

{
  "content": [
    {
      "id": "0b2d1cdecc7611e7abc4cec278b6b50a",
      "agentId": "3b27818ea09a46b48c7eb3fbd878349f",
      "status": "ACTIVE"
    }
  ],
  "last": true,
  "totalPages": 1,
  "totalElements": 1,
  "numberOfElements": 1,
  "first": true,
  "sort": {},
  "size": 20,
  "number": 0
}

An activation can be updated or deleted using the diagnosticActivations/{id} endpoint:

PUT /api/mindconnect/v3/diagnosticActivations/{id} HTTP/1.1
{
  "status": "INACTIVE"
}
The parameter is validated as follows:

  • status: String, validated. Allowed inputs are ACTIVE and INACTIVE.
    Troubleshooting:
      HTTP 400: Custom statuses are not allowed. Status value must be one of: [ACTIVE, INACTIVE]
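The status validation above can be mirrored on the client, rejecting custom statuses before the PUT is sent. A minimal sketch; the helper is illustrative, not part of any MindSphere SDK:

```python
# Statuses the diagnostic activation update endpoint accepts.
ALLOWED_STATUSES = {"ACTIVE", "INACTIVE"}

def activation_update_body(status):
    """Build the update body, rejecting custom statuses up front."""
    if status not in ALLOWED_STATUSES:
        raise ValueError("Status value must be one of: [ACTIVE, INACTIVE]")
    return {"status": status}
```

For example, activation_update_body("INACTIVE") yields the request body shown above, while an unknown status raises locally instead of triggering an HTTP 400 response.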

Diagnostic data is listed by the diagnosticInformation endpoint. Its response is a page as shown below, which can be filtered by the agentId, correlationId, message, source, timestamp and severity fields using a JSON filter.

{
  "content": [
    {
      "agentId": "3b27818ea09a46b48c7eb3fbd878349f",
      "correlationId": "3fcf2a5ecc7611e7abc4cec278b6b50a",
      "severity": "INFO",
      "message": "[Accepted] Exchange request arrived",
      "source": "TIMESERIES",
      "state": "ACCEPTED",
      "timestamp": "2018-08-27T16:40:11.235Z"
    },
    {
      "agentId": "78e73978d6284c75ac2b8b5c33ffff57",
      "correlationId": "39301469413183203248389064099640",
      "severity": "ERROR",
      "message": "[Dropped] Registration is missing. Data type is: notstandardTimeSeries",
      "source": "TIMESERIES",
      "state": "DROPPED",
      "timestamp": "2018-08-27T16:40:11.235Z"
    }
  ],
  "last": true,
  "totalPages": 1,
  "totalElements": 2,
  "numberOfElements": 2,
  "first": true,
  "sort": {},
  "size": 20,
  "number": 0
}
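A response page like the one above can be scanned for dropped records, e.g. to surface ingest problems for a single agent. A minimal sketch with placeholder agent ids, following the page structure of the sample response:

```python
# Collect the messages of DROPPED records from a diagnosticInformation page,
# optionally restricted to one agent.
def dropped_messages(page, agent_id=None):
    return [item["message"]
            for item in page.get("content", [])
            if item.get("state") == "DROPPED"
            and (agent_id is None or item.get("agentId") == agent_id)]

page = {"content": [
    {"agentId": "a1", "state": "ACCEPTED",
     "message": "[Accepted] Exchange request arrived"},
    {"agentId": "a2", "state": "DROPPED",
     "message": "[Dropped] Registration is missing"},
]}
assert dropped_messages(page) == ["[Dropped] Registration is missing"]
```

Because responses are paged, a real client would repeat this over all pages until "last" is true.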



Except where otherwise noted, content on this site is licensed under the MindSphere Development License Agreement.