
Uploading agent data

Before an agent can upload data to MindSphere, additional configuration is required so that MindSphere knows how to interpret the agent's data stream. This configuration consists of the following definitions within MindSphere:

  • Data Source Provisioning
  • Property Set Provisioning
  • Mapping for Data Source and Property Set

Creating a Data Source Configuration

A Data Source is a logical group that holds so-called Data Points. Each Data Point holds metadata about a specific metric that the agent generates or measures.

For example, if an agent measures ambient temperature and pressure data, each of these two measurements needs to be defined as a separate Data Point:

  • Data Point 1: Temperature measurement
  • Data Point 2: Pressure measurement

A Data Source, on the other hand, is the grouping object that encapsulates these Data Points.

Data Points are defined via a Data Source Configuration. MindSphere provides the dataSourceConfiguration endpoint of the Agent Management Service to create Data Points and the Data Source they are grouped into. A sample configuration request is provided below:

PUT /api/agentmanagement/v3/agents/{{agent_id}}/dataSourceConfiguration HTTP/1.1
Content-Type: application/json
If-Match: etag
Authorization: Bearer eyx...
{
    "configurationId": "string",
    "dataSources": [
        {
        "name": "string",
        "description": "string",
        "dataPoints": [
            {
            "id": "string",
            "name": "string",
            "description": "string",
            "type": "int",
            "unit": "string",
            "customData": {}
            }
        ],
        "customData": {}
        }
    ]
}
Parameter | Description | Remarks
dataSource.name | Name of the Data Source | Mandatory. Any string is allowed.
dataSource.description | Description of the Data Source | Optional. Any string is allowed.
dataPoint.id | Data Point Id | Mandatory. Must be unique per agent; no duplicates are allowed.
dataPoint.name | Name of the Data Point | Mandatory. Any string is allowed, e.g. pressure, voltage, current, etc.
dataPoint.description | Description of the Data Point | Optional. Any string is allowed.
dataPoint.type | Data type of the Data Point | Mandatory. Can be a predefined type. By default, MindSphere provides the following base types: Double, Long, Int, String, Boolean.
dataPoint.unit | Unit of the Data Point | Mandatory. Any string is allowed, e.g. percent, SI, etc. Empty strings are allowed.
dataPoint.customData | Custom data, if any is provided | Optional
dataSource.customData | Custom data, if any is provided | Optional
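The request body above can be assembled programmatically. The following is a minimal sketch using only the Python standard library; the configuration id, Data Source name, and Data Point values are illustrative, while the field names and the uniqueness rule come from the table above.

```python
import json

def build_data_source_configuration(configuration_id: str) -> dict:
    """Build a Data Source Configuration with one Data Source and two Data Points."""
    return {
        "configurationId": configuration_id,
        "dataSources": [
            {
                "name": "EnvironmentSensors",      # mandatory, any string
                "description": "Ambient sensors",  # optional
                "dataPoints": [
                    # dataPoint.id must be unique per agent -- no duplicates
                    {"id": "DP0001", "name": "Temperature",
                     "description": "Ambient temperature",
                     "type": "DOUBLE", "unit": "C", "customData": {}},
                    {"id": "DP0002", "name": "Pressure",
                     "description": "Ambient pressure",
                     "type": "DOUBLE", "unit": "kPa", "customData": {}},
                ],
                "customData": {},
            }
        ],
    }

# Serialize the body; send it as a PUT to
# /api/agentmanagement/v3/agents/{agent_id}/dataSourceConfiguration
# with an If-Match header carrying the current ETag and a Bearer token.
body = json.dumps(build_data_source_configuration("CF0001"))
```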

If successful, MindSphere returns an HTTP response 200 OK with a JSON body that holds the created Data Source Configuration:

{
    "id": "string",
    "eTag": "2",
    "configurationId": "string",
    "dataSources": [
        {
        "name": "string",
        "description": "string",
        "dataPoints": [
            {
            "id": "string",
            "name": "string",
            "description": "string",
            "type": "int",
            "unit": "string",
            "customData": {}
            }
        ],
        "customData": {}
        }
    ]
}

Warning

When an existing Data Source Configuration is updated, all Data Point Mappings of the agent are deleted.

Note

For consuming the exchange services, the parameter configurationId needs to be provided to MindSphere.

Creating a Data Point Mapping

Before creating a Data Point Mapping, a Data Source (the grouping object for Data Points) and a Property Set (the grouping object for properties) must already be defined in MindSphere.

A Data Point Mapping needs to be defined for MindSphere to interpret the data flowing from the agent to MindSphere. The Data Source Configuration holds metadata about the agent side, whereas a Property Set holds metadata about the IoT side. Finally, MindSphere needs information on how to map Data Point metadata to Property metadata. This configuration is called a Data Point Mapping and defines a mapping from each Data Point to a property.

In order to create a mapping, MindSphere provides the dataPointMappings endpoint of the MindConnect API, which needs to be used with the following sample mapping parameters:

POST /api/mindconnect/v3/dataPointMappings HTTP/1.1
Content-Type: application/json
Authorization: Bearer eyc..
{
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0",
    "dataPointId": "DP0001",
    "entityId": "83e78008eadf453bae4f5c7bef3db550",
    "propertySetName": "ElectricalProperties",
    "propertyName": "Voltage"
}
Parameter | Description | Validation Error Response
agentId | String, validated, null-checked. Must exist in Agent Management. | "Agent is empty!" HTTP 400; "AgentId {...} in the data point mapping does not exist!" HTTP 404
dataPointId | String, validated, null-checked. Must exist in Agent Management and belong to the Data Source Configuration of the given agentId. | "DataPointId is empty!" HTTP 400; "DataPointId {...} in the data point mapping does not exist in the related agent!" HTTP 404
entityId | String, validated, null-checked. Must exist in the IoT Entity Service. | "Entity Id is empty!" HTTP 400; "Not Found" HTTP 404
propertySetName | String, validated, null-checked. Must exist in the IoT Type Service and belong to the entity type of entityId. | "PropertySetName is empty!" HTTP 400; "Property Set not found with name {...}" HTTP 404; "Entity Type does not own a Property Set with name {...}" HTTP 404
propertyName | String, validated. Must exist in the IoT Type Service and belong to the Property Set. | "PropertyName is empty!" HTTP 400; "Property set does not own a Property with name {...}" HTTP 404

Also, the following aspects have to be considered:

  • The unit and type of the data point and the property must match.
  • If the property unit is null, the data point unit must be an empty string.
  • The property category cannot be static.
  • A Data Point cannot be mapped to multiple properties with the same property name.
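The mapping body and the unit rule can be sketched as follows. This is a hedged example: build_mapping mirrors the request body above, while units_compatible is a hypothetical client-side helper reflecting the unit constraints from the list, not a MindSphere API.

```python
def build_mapping(agent_id: str, data_point_id: str, entity_id: str,
                  property_set_name: str, property_name: str) -> dict:
    """Body for POST /api/mindconnect/v3/dataPointMappings (one pair per request)."""
    return {
        "agentId": agent_id,
        "dataPointId": data_point_id,
        "entityId": entity_id,
        "propertySetName": property_set_name,
        "propertyName": property_name,
    }

def units_compatible(data_point_unit: str, property_unit) -> bool:
    """Client-side pre-check: units must match; a null (None) property unit
    requires an empty data point unit."""
    if property_unit is None:
        return data_point_unit == ""
    return data_point_unit == property_unit

mapping = build_mapping("11961bc396cd4a87a9b26b723f5b7ba0", "DP0001",
                        "83e78008eadf453bae4f5c7bef3db550",
                        "ElectricalProperties", "Voltage")
```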

Before creating a Data Point Mapping, MindSphere executes the following checks:

[Figure: Create Data Point Mapping workflow and checks]

If successful, MindSphere returns an HTTP response 201 Created with a JSON body that holds the created mapping configuration. A mapping connects only a single data point to a single property. If more mappings are needed, this step must be repeated for each one.

{
    "id": "4fad6258-5def-4d84-a4c2-1481b209c116",
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0",
    "dataPointId": "DP0001",
    "dataPointUnit": "%",
    "dataPointType": "INT",
    "entityId": "83e78008eadf453bae4f5c7bef3db550",
    "propertySetName": "ElectricalProperties",
    "propertyName": "Voltage",
    "propertyUnit": "%",
    "propertyType": "INT",
    "qualityEnabled": true
}

Warning

When an existing Data Source Configuration is updated, all Data Point Mappings of the agent are deleted.

Consuming exchange services

The MindSphere exchange endpoint of the MindConnect API provides the agent with the capability of uploading its data to MindSphere. This data can be of type:

  • Time Series
  • File Upload
  • Event Upload

The format conforms to a subset of the HTTP multipart specification, but permits only two levels of nesting.

Below is a sample exchange message content:

--f0Ve5iPP2ySppIcDSR6Bak
Content-Type: multipart/related;boundary=penFL6sBQHJJUN3HA4ftqC

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "standardTimeSeries",
        "version": "1.0",
        "details": {
            "configurationId": "{{_configuration_id}}"
        }
    }
}
--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/json
[
    {
        "timestamp": "2017-02-01T08:30:03.780Z",
        "values": [
            {
                "dataPointId": "{{_datapoint_id_1}}",
                "value": "9856",
                "qualityCode": "0"
            },
            {
                "dataPointId": "{{_datapoint_id_2}}",
                "value": "3766",
                "qualityCode": "0"
            }
        ]
    }
]
--penFL6sBQHJJUN3HA4ftqC--
--f0Ve5iPP2ySppIcDSR6Bak--

In the sample, the multipart message starts with the --f0Ve5iPP2ySppIcDSR6Bak boundary. A leading double dash indicates a boundary start, whereas a boundary wrapped in double dashes on both sides, as in --f0Ve5iPP2ySppIcDSR6Bak--, indicates the end of the message.

The multipart messages are nested within the initial boundary. Each multipart message consists of metadata and a payload; therefore, each message contains two boundary start identifiers and, at the end, a single boundary end:

--initial boundary

--boundary1
Metadata

--boundary1
Payload

--boundary1--

--initial boundary--
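The boundary nesting above can be sketched with two hypothetical helper functions; only the boundary syntax and the metadata content type come from the samples in this section.

```python
def make_tuple(inner_boundary: str, meta: str, payload: str,
               payload_type: str = "application/json") -> str:
    """Build one metadata/payload tuple delimited by its inner boundary."""
    parts = [
        f"--{inner_boundary}",
        "Content-Type: application/vnd.siemens.mindsphere.meta+json",
        "",
        meta,
        f"--{inner_boundary}",
        f"Content-Type: {payload_type}",
        "",
        payload,
        f"--{inner_boundary}--",  # inner boundary end
    ]
    return "\r\n".join(parts)

def make_message(outer_boundary: str, inner_boundary: str, tuples) -> str:
    """Wrap one or more tuples inside the initial (outer) boundary."""
    body = []
    for t in tuples:
        body.append(f"--{outer_boundary}")
        body.append(f"Content-Type: multipart/related;boundary={inner_boundary}")
        body.append("")
        body.append(t)
    body.append(f"--{outer_boundary}--")  # message end
    return "\r\n".join(body)
```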

Within MindSphere, metadata and payload couples are referred to as tuples. Each tuple may contain a different type of data (for example, one tuple may contain time series data with a specific timestamp, whereas another may contain octet-stream data). The example below contains an exchange payload for an application/octet-stream MIME type:

--5c6d7e29ef6868d0eb73
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "file",
        "version": "1.0",
        "details": {
            "fileName":"data_collector.log.old",
            "creationDate":"2017-02-10T13:00:00.000Z",
            "fileType":"log"
        }
    }
}

--5c6d7e29ef6868d0eb73
Content-Type: application/octet-stream

  ... File content here ...

--5c6d7e29ef6868d0eb73--
--f0Ve5iPP2ySppIcDSR6Bak--

In order to upload data (e.g. time series, files or events), the agent must use the exchange endpoint of the MindConnect API. Below is a sample request:

POST {{_gateway_url}}/api/mindconnect/v3/exchange
Content-Type: multipart/mixed; boundary=f0Ve5iPP2ySppIcDSR6Bak
Authorization: Bearer access_token ...

--f0Ve5iPP2ySppIcDSR6Bak
Content-Type: multipart/related;boundary=penFL6sBQHJJUN3HA4ftqC

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/vnd.siemens.mindsphere.meta+json
{
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "standardTimeSeries",
        "version": "1.0",
        "details": {
            "configurationId": "{{_configuration_id}}"
        }
    }
}

--penFL6sBQHJJUN3HA4ftqC
Content-Type: application/json
[
    {
        "timestamp": "2017-02-01T08:30:03.780Z",
        "values": [
            {
                "dataPointId": "{{_datapoint_id_1}}",
                "value": "9856",
                "qualityCode": "0"
            },
            {
                "dataPointId": "{{_datapoint_id_2}}",
                "value": "3766",
                "qualityCode": "0"
            }
        ]
    }
]
--penFL6sBQHJJUN3HA4ftqC--
--f0Ve5iPP2ySppIcDSR6Bak--

Note

The property configurationId in the metadata part of the multipart message tells MindSphere how to interpret the data it receives (see section Creating a Data Source Configuration).
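As a minimal sketch, the exchange body from the sample above can be assembled as a plain string before posting it. The configurationId and data point ids are placeholders, exactly as in the sample; the boundaries and content types are taken from it.

```python
import json

OUTER = "f0Ve5iPP2ySppIcDSR6Bak"   # outer (initial) boundary
INNER = "penFL6sBQHJJUN3HA4ftqC"   # inner (tuple) boundary

# Metadata part: tells MindSphere how to interpret the payload.
meta = json.dumps({
    "type": "item",
    "version": "1.0",
    "payload": {
        "type": "standardTimeSeries",
        "version": "1.0",
        "details": {"configurationId": "CF0001"},  # placeholder id
    },
})

# Payload part: one timestamped record with two data point values.
timeseries = json.dumps([{
    "timestamp": "2017-02-01T08:30:03.780Z",
    "values": [
        {"dataPointId": "DP0001", "value": "9856", "qualityCode": "0"},
        {"dataPointId": "DP0002", "value": "3766", "qualityCode": "0"},
    ],
}])

body = "\r\n".join([
    f"--{OUTER}",
    f"Content-Type: multipart/related;boundary={INNER}",
    "",
    f"--{INNER}",
    "Content-Type: application/vnd.siemens.mindsphere.meta+json",
    "",
    meta,
    f"--{INNER}",
    "Content-Type: application/json",
    "",
    timeseries,
    f"--{INNER}--",
    f"--{OUTER}--",
])
# POST body to {{_gateway_url}}/api/mindconnect/v3/exchange with headers
# "Content-Type: multipart/mixed; boundary=f0Ve5iPP2ySppIcDSR6Bak"
# and "Authorization: Bearer <access_token>".
```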

Agents can upload business events of an event type derived from the type AgentBaseEvent. Below is a sample exchange request for uploading an event:

--FRaqbC9wSA2XvpFVjCRGry
Content-Type: multipart/related;boundary=5c6d7e29ef6868d0eb73

--5c6d7e29ef6868d0eb73
Content-Type: application/vnd.siemens.mindsphere.meta+json

{
    "type": "item",
    "version": "1.0",
    "payload": {
      "type": "businessEvent",
      "version": "1.0"
    }
}
--5c6d7e29ef6868d0eb73
Content-Type: application/json

[
    {
        "id": "7ba7b810-9dad-11d1-80b4-00c04fd430c8",
        "correlationId": "fd7fb194-cd73-4a54-9e53-97aca7bc8568",
        "timestamp": "2018-07-11T11:06:25.317Z",
        "severity": 2,
        "type": "{{eventType_from_AgentBaseEvent}}",
        "description": "file uploaded event",
        "version": "1.0",
        "details": {
          "requiredStringField": "required",
          "booleanField": "true",
          "stringField": "string",
          "integerField": "10",
          "doubleField": 10.0
        }
    }
]
--5c6d7e29ef6868d0eb73--
--FRaqbC9wSA2XvpFVjCRGry--

Upon accepting the structure of the multipart request, MindSphere returns an HTTP response 200 OK with an empty body. MindSphere validates and stores the data asynchronously. The agent can upload data as long as its access token is valid.

Diagnosing exchange services

Data uploaded via the MindConnect API can be diagnosed with the Diagnostic Service feature. In order to activate an agent for the Diagnostic Service, MindSphere provides the diagnosticActivations endpoint of the MindConnect API, which needs to be used with the following parameters:

POST /api/mindconnect/v3/diagnosticActivations HTTP/1.1
Content-Type: application/json
Authorization: Bearer eyc..
{
    "agentId": "11961bc396cd4a87a9b26b723f5b7ba0"
}
Parameter | Description | Validation Error Response
agentId | String, validated, null-checked | "Agent is empty!" HTTP 400; "Agent id {...} exists!" HTTP 404; "Agent id {...} could not be enabled, because the agent limitation is {...} for tenant {...}!"

Note

Only 5 agents per tenant can be activated for the Diagnostic Service. Activations can be deleted using the diagnosticActivations/id endpoint.

Diagnostic data can be listed using the diagnosticInformation endpoint. The response can be filtered with a JSON filter including the agentId, correlationId and severity fields.
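The activation body and a filtered listing URL can be sketched as follows. The endpoint paths and filter fields come from the text above; the name of the query parameter carrying the JSON filter ("filter") is an assumption, as is everything else in this sketch.

```python
import json
from urllib.parse import urlencode

def activation_request(agent_id: str) -> dict:
    """Body for POST /api/mindconnect/v3/diagnosticActivations."""
    return {"agentId": agent_id}

def diagnostic_information_url(gateway: str, agent_id: str, severity: int) -> str:
    """GET URL for diagnosticInformation, filtered by agentId and severity.
    The 'filter' query parameter name is an assumption."""
    flt = json.dumps({"agentId": agent_id, "severity": severity})
    return (f"{gateway}/api/mindconnect/v3/diagnosticInformation?"
            + urlencode({"filter": flt}))
```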

Except where otherwise noted, content on this site is licensed under the MindSphere Development License Agreement.