...

Type of I/O tag | Address | Description
I/O tags for reading data sent by the MQTT server through a PUBLISH message.
Note: the values of the I/O tags are set by the D2000 KOM process in the order IN_TOPIC, IN_DATA, and IN_ID. The configuration does not have to contain all three I/O tags.
TxtI
IN_TOPIC
Topic (Topic) of the received PUBLISH message.
TxtI
IN_DATA
Data (Payload) of the received PUBLISH message.
Ci
IN_ID
Packet identifier (Packet Identifier) of the received PUBLISH message; its value depends on the confirmation level (QoS).
For messages sent with QoS_0 the identifier is zero; for QoS_1 and QoS_2 it is a positive 16-bit number.
Note: if the MQTT server also sends messages with the QoS_0 confirmation level and the ACK_ID I/O tag is configured, we recommend activating the New value when changing time option in the Filter tab, so that repeated writing of the value 0 generates a new value that differs only in the timestamp.
I/O tag to confirm the received data to the MQTT server.
Co
ACK_ID
If an output I/O tag with the ACK_ID address is defined, the D2000 KOM process expects the processing of each message to be confirmed by writing a copy of the value of the IN_ID tag. Only then does it set the values from the next received PUBLISH message (if one was received in the meantime) into the IN_TOPIC, IN_DATA, and IN_ID I/O tags (in this order).
In the case of the QoS_0 confirmation level, it is therefore necessary to repeatedly set the value of the ACK_ID I/O tag to 0.
If the ACK_ID I/O tag does not exist, the values are written into the IN_TOPIC, IN_DATA, and IN_ID I/O tags immediately after the PUBLISH message is received and processed.
Note: for messages received with the QoS_0 confirmation level, no confirmation is sent to the MQTT server; only the values of the received PUBLISH message are set into the I/O tags.
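An illustrative confirmation sequence (the topic, payload, and identifier values are only examples):
1. A PUBLISH message is received → IN_TOPIC = "plant/line1/status", IN_DATA = "23.5", IN_ID = 1234.
2. After the received values have been processed, the value 1234 is written to ACK_ID.
3. The D2000 KOM process sets the IN_TOPIC, IN_DATA, and IN_ID I/O tags from the next received PUBLISH message (if one is waiting). For a message received with QoS_0, IN_ID is 0, so the value 0 is written to ACK_ID.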
I/O tags for sending values to the MQTT server through a PUBLISH message.
Note: in order for the D2000 KOM process to send PUBLISH messages to the MQTT server, both I/O tags must be defined within one station.
TxtO
OUT_TOPIC
The topic of the PUBLISH message being sent.
TxtO

OUT_VALUE

Data (Payload) of the PUBLISH message being sent.
Note: the message is sent as a result of writing to the OUT_VALUE I/O tag (i.e. if the Topic does not change, it is sufficient to set the OUT_TOPIC I/O tag once, e.g. by using a default value).
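An illustrative example (the topic and payload values are only examples): if OUT_TOPIC is set once to "d2000/station1/command" (e.g. as a default value), then each write of a text such as "ON" to OUT_VALUE sends a PUBLISH message with the Topic "d2000/station1/command" and the Payload "ON".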
I/O tags for parsing JSON messages

TxtI, TxtO, Qi, Ci, Co, Ai, Ao, Di, Do, TiR, ToR

JA=json_address

If Payload Type=JSON, the message is parsed as JSON data. The json_address value specifies the name of the JSON field whose value is to be assigned to the I/O tag.
For structured JSON messages, the syntax level1.level2.level3 ... is supported, e.g. rx.current; if the message contains arrays (indexed from 1), the syntax level1[index1].level2[index2].level3 ... is also possible, e.g. rx.gwrx[1].time.
Since the JSON message itself can be an array, the address can also start with an index, e.g. JA=[1].batt_cell_v_avg.
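An illustrative example: for a hypothetical JSON payload
{"rx": {"current": 12.5, "gwrx": [{"time": "2024-10-30T17:25:29Z"}]}}
the address JA=rx.current reads the value 12.5 and the address JA=rx.gwrx[1].time reads the value "2024-10-30T17:25:29Z".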

For other examples, see the description of the LoRaWAN protocol's Envelope type I/O tags.

I/O tags for parsing Sparkplug messages

TxtI, TxtO, Qi, Ci, Co, Ai, Ao, Di, Do, TiR, ToR, TiA, ToA

Input I/O tags:
SA=sparkplug_address

Output I/O tags:
ST=type;SA=sparkplug_address

If Payload Type=Sparkplug, the message is parsed as Sparkplug data (a binary format built on Google Protocol Buffers). Sparkplug data contains metrics that have text identifiers (sparkplug_address).

Reading template items is possible by specifying sparkplug_address in the format <TemplateName1><Separator><TemplateName2><Separator> ... <Separator><ItemName> where:

  • <TemplateNameX> is the name of the template/sub-template
  • <Separator> is the separator for individual levels (by default the characters "->"; it can be changed with the Item Separator parameter if this sequence occurs in template/item names)
  • <ItemName> is the item name of the deepest nested template

Examples of template item addresses:
SA=Template1->SubTemplate2->Item
SA=secUDT->sec

Reading dataset items (equivalent to structured variables in D2000) is possible by specifying sparkplug_address in the format <DatasetName>[<Row>]^<ColumnName> where:

  • <DatasetName> is the name of the dataset (it can also be part of a structure, e.g. Template1->SubTemplate2->Dataset3)
  • <Row> is the row number (1..N) or the "*" character. In the latter case, it is possible to configure the Destination column to which all rows are written (the values from the first row of the corresponding column are written to the I/O tag)
  • <ColumnName> is the name of the dataset column

Examples of dataset item addresses:
SA=Performance[3]^ActivePower
SA=Machine2->Parameters[1]^ActivePower
SA=DHS/Formation Data->Reservoir Parameter[*]^Gas density


For output I/O tags, the value type must be specified. Simple types are supported (not template items/dataset items):

  • Int8
  • Int16
  • Int32
  • Int64
  • UInt8
  • UInt16
  • UInt32
  • UInt64
  • Float
  • Double
  • Boolean
  • String
  • DateTime
  • Text

The PUBLISH message created during writing contains a Topic derived from the station address. The message type depends on the station address - whether it is Edge Node (NCMD) or Device/Sensor (DCMD). The Payload contains a timestamp, a value type (type), a written value (encoded according to the specified value type), and a metric name (sparkplug_address).
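An illustrative example of an output I/O tag address (the metric name is only an example):
ST=Double;SA=Outputs/Setpoint
Writing the value 21.5 to such an I/O tag produces an NCMD or DCMD PUBLISH message (depending on the station address) whose Payload contains a timestamp, the Double type, the value 21.5, and the metric name "Outputs/Setpoint".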

TxtI

IN_SP2JS

The I/O tag is used to convert the Sparkplug payload into a JSON representation, which can then be processed, e.g. in ESL script. Depending on the Convert Datatype/Timestamp to Text parameter, a textual representation of the value type and timestamp is also added. 
An example of a value:

{"metrics":[{"datatype":3,"int_value":7338992,"name":"Corrected Vol Acc Stn","timestamp":1729664005479}],"seq":32,"timestamp":1729664005479}

After formatting into readable form:

{
    "metrics": [
        {
            "datatype": 3,
            "int_value": 7338992,
            "name": "Corrected Vol Acc Stn",
            "timestamp": 1729664005479
        }
    ],
    "seq": 32,
    "timestamp": 1729664005479
}

An example of a more complex value containing properties and a dataset, which also displays a textual representation of the data type (datatype_txt) and the timestamp (timestamp_txt) as a result of the Convert Datatype/Timestamp to Text parameter being set:

{
    "metrics": [
        {
            "datatype": 12,
            "datatype_txt": "String",
            "name": "Node Properties/Configuration",
            "string_value": "{}",
            "timestamp": 1730305529539,
            "timestamp_txt": "30-10-2024 17:25:29.539"
        },
        {
            "alias": 30064771073,
            "datatype": 5,
            "datatype_txt": "Uint8",
            "int_value": 0,
            "name": "Node Properties/Missing Param",
            "properties": {
                "keys": [
                    "usage"
                ],
                "values": [
                    {
                        "string_value": "technical information",
                        "type": 12,
                        "type_txt": "String"
                    }
                ]
            },
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        },
        {
            "alias": 0,
            "dataset_value": {
                "columns": [
                    "topic_name",
                    "offset",
                    "length",
                    "crc"
                ],
                "num_of_columns": 4,
                "rows": [
                    {
                        "row": [
                            "N/A",
                            0,
                            0,
                            0
                        ]
                    }
                ],
                "types": [
                    12,
                    7,
                    7,
                    7
                ],
                "types_txt": [
                    "String",
                    "UInt32",
                    "UInt32",
                    "UInt32"
                ]
            },
            "datatype": 16,
            "datatype_txt": "DataSet",
            "name": "Node Control/FW Update",
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        }
    ],
    "seq": 0,
    "timestamp": 1730305529536,
    "timestamp_txt": "30-10-2024 17:25:29.536"
}


Note: it is also possible to monitor the status of other Sparkplug Host Applications connected to the MQTT server. If the Identifier of Host Application is e.g. "ACME", it is necessary to create a station with the address "spBv1.0/STATE/ACME" (or in the abbreviated form "ACME") and, on it, an I/O tag of type Di with the address "JA=online" (since the Host Application sends a STATE message with a JSON payload).
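An illustrative example: the "ACME" Host Application could publish a STATE message with a JSON payload such as
{"online": true, "timestamp": 1730305529536}
and the I/O tag with the address "JA=online" would then receive the value True or False.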

...