
...

Keyword

Full name

Meaning

Unit

Default value

SSA

Subscribe Station Address

If this parameter has the value YES, the Topic related to the station address is also added to the SUBSCRIBE message:

  • If Payload Type = Sparkplug Host, the following topics are created from the station address:
    • NDATA/DDATA topic (e.g., spBv1.0/myGroup/NDATA/myEdgeNode or spBv1.0/myGroup/DDATA/myEdgeNode/myDevice1)
    • NBIRTH/DBIRTH topic (e.g., spBv1.0/myGroup/NBIRTH/myEdgeNode or spBv1.0/myGroup/DBIRTH/myEdgeNode/myDevice1)
    • NDEATH/DDEATH topic (e.g., spBv1.0/myGroup/NDEATH/myEdgeNode or spBv1.0/myGroup/DDEATH/myEdgeNode/myDevice1)
  • If Payload Type = Sparkplug Edge Node, the parameter is ignored - NCMD/DCMD topic is always created (e.g., spBv1.0/myGroup/NCMD/myEdgeNode or spBv1.0/myGroup/DCMD/myEdgeNode/myDevice1)
  • If Payload Type = Text only or JSON, the topic is the same as the station address (i.e., it makes no sense to set this parameter on stations that have addresses in the form of a regular expression, e.g., status/batt.*)
YES/NO

NO

SWT

Station Will Topic

Will topic of the device. If this parameter is set and a message with the same topic is received, the station will go into a communication error (StHardErr) and the values of the I/O tags will be invalidated. In this way, it is possible to emulate the standard behavior that occurs when there is a communication error with the device (even if the communication between the D2000 Kom process and the MQTT broker is functional).
If Payload Type = Sparkplug Host, it is not necessary to enter this parameter - if an NDEATH/DDEATH message arrives with a Topic that matches the station address, the station will go into a communication error.



SWP

Station Will Payload

Content of the Will message. If this parameter is set and a message with the same topic as defined by the Station Will Topic parameter is received, the Payload must also be the same. If this parameter is an empty string, matching the topic with the Station Will Topic parameter is sufficient.
Note: This parameter was implemented because MQTT brokers send messages with the same Topic when connecting/disconnecting the device, the difference being only in the payload.
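A hypothetical example: a device publishes its Last Will message to the topic devices/sensor1/status with the payload offline. The station protocol parameters can then be set, for example, to:

SWT=devices/sensor1/status
SWP=offline

When a message with this topic and payload is received, the station goes into a communication error (StHardErr) and the values of its I/O tags are invalidated.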



FPT

Payload Type

The setting of message parsing (overriding the line parameter Payload Type):

  • Default - the line parameter Payload Type is respected
  • Text only - the message is not parsed, it is assigned to the I/O tag with the address IN_TOPIC
  • JSON - the message is parsed as JSON data. If there is an I/O tag with the address IN_TOPIC, the whole message will be assigned to it.
    If there are I/O tags with addresses JA=json_address, they will be populated with the appropriate data from the JSON message. If no such addresses exist in the message, the I/O tags will be invalidated.
  • Sparkplug - the message is parsed as a Sparkplug B payload (binary encoded). Depending on the line parameter Payload Type, this is Sparkplug Host or Sparkplug Edge Node mode (Sparkplug Host by default).
Default
Text only
JSON
Sparkplug
Default

FTF

Time Field Name

If Payload Type = JSON, the name of the field with a timestamp - overriding the line parameter Time Field Name.

--

FTM

Time Mask

Mask for parsing a value in the field with a timestamp - overriding the line parameter Time Mask.

Note: Whether the time is interpreted as local or UTC with a configured offset depends on the time station parameters settings.
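A hypothetical example: for a JSON message such as

{"temperature": 21.5, "time": "30-10-2024 17:25:29.539"}

the Time Field Name would be set to time and the Time Mask to a mask matching this format (e.g. dd-mm-yyyy hh:mi:ss.mss; the exact mask syntax is described with the line parameter Time Mask).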

--

Sparkplug parameters



SR

Send Node Control/Rebirth

If Payload Type = Sparkplug Host, a command (NCMD or DCMD) with the metric 'Node Control/Rebirth' is sent to the Sparkplug station when connecting to the MQTT server. The response should be a message (NBIRTH/DBIRTH) containing all current metrics.

YES/NO

YES
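For illustration, the rebirth request can be shown in the same JSON representation as used by the IN_SP2JS I/O tag described below (the actual message is a binary Sparkplug B payload; the field names follow the Sparkplug B definition and the timestamp is hypothetical):

{
    "metrics": [
        {
            "boolean_value": true,
            "datatype": 11,
            "name": "Node Control/Rebirth",
            "timestamp": 1730305529536
        }
    ],
    "timestamp": 1730305529536
}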

SAM

Sparkplug Alias Mode

If Payload Type = Sparkplug Edge Node, the parameter specifies the alias usage mode. Aliases are numeric (integer - Int64) identifiers, optionally used in data and command messages (NDATA/DDATA/NCMD/DCMD) instead of text identifiers in order to reduce the message size. If used, they are listed in the NBIRTH/DBIRTH message together with the text identifiers. Aliases must be unique within all I/O tags belonging to one station.

  • Default (SAL=alias) - Aliases are used for those I/O tags that have them configured directly in the address as a SAL item (e.g. ST=UInt16;SAL=2;SA=Second).
  • Automatic (HOBJ) - Aliases are used for all I/O tags belonging to the station. The alias value is the HOBJ (ID) of the I/O tag. If an alias is configured in the I/O tag address, it is ignored.
  • Off - Aliases are not used. If they are configured directly in the I/O tag address, they are ignored.

Default (SAL=alias)

Automatic (HOBJ)

Off

Default (SAL=alias)
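For illustration of alias usage (hypothetical values, shown in the JSON representation used by the IN_SP2JS I/O tag described below): an I/O tag with the address ST=UInt16;SAL=2;SA=Second is published in the NBIRTH/DBIRTH message with both its name and alias, while subsequent NDATA/DDATA messages may carry the alias only:

NBIRTH/DBIRTH metric:
{"alias": 2, "datatype": 6, "int_value": 15, "name": "Second", "timestamp": 1730305529537}

NDATA/DDATA metric:
{"alias": 2, "datatype": 6, "int_value": 16, "timestamp": 1730305529538}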


I/O tag configuration

...

Type of I/O tag
Address
Description
I/O tags for reading data sent by the MQTT server through a PUBLISH message.
Note: values of I/O tags are set by the D2000 KOM process in the order IN_TOPIC, IN_DATA, and IN_ID. It is not necessary for the configuration to contain all three I/O tags.
TxtI
IN_TOPIC
Topic of the received PUBLISH message.
TxtI
IN_DATA
Data (Payload) of the received PUBLISH message.
Ci
IN_ID
Identifier of the packet (Packet Identifier) of the received PUBLISH message, which depends on the confirmation level (QoS).
For messages sent with QoS_0, the identifier is zero; for QoS_1 and QoS_2, it is a positive 16-bit number.
Note: if the MQTT server also sends messages with the QoS_0 confirmation level and the ACK_ID I/O tag is configured, we recommend activating the option New value when changing time in the Filter tab, so that repeated writing of the value 0 generates a new value that differs only in the timestamp.
I/O tag to confirm the received data to the MQTT server.
Co
ACK_ID
If an output I/O tag with the ACK_ID address is defined, the D2000 KOM process expects the processing of each message to be confirmed by writing a copy of the value of the IN_ID tag. Only then does it set the values from the next received PUBLISH message (if one was received in the meantime) into the IN_TOPIC, IN_DATA, and IN_ID I/O tags (in this order).
In the case of the QoS_0 confirmation level, it is therefore necessary to repeatedly set the value of the ACK_ID I/O tag to 0.
If the ACK_ID I/O tag does not exist, the values are written into the IN_TOPIC, IN_DATA, and IN_ID I/O tags immediately after the PUBLISH message is received and processed.
Note: for messages received with the QoS_0 confirmation level, no confirmation is sent to the MQTT server; only the values of the received PUBLISH message are published.
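A worked example of the confirmation sequence (hypothetical values):

  1. A PUBLISH message with QoS_1 and packet identifier 1234 is received; the D2000 KOM process sets the IN_TOPIC, IN_DATA, and IN_ID (=1234) I/O tags in this order.
  2. The application processes the data and writes the value 1234 to the ACK_ID I/O tag; for a message received with QoS_1 or QoS_2, a confirmation is also sent to the MQTT server.
  3. Only then does the D2000 KOM process set the values of the next received PUBLISH message (if one arrived in the meantime) into the IN_TOPIC, IN_DATA, and IN_ID I/O tags.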
I/O tags for sending values to the MQTT server through a PUBLISH message.
Note: in order for the D2000 KOM process to send the PUBLISH messages to the MQTT server, both I/O tags must be defined within one station.
TxtO
OUT_TOPIC
The topic of the PUBLISH message being sent.
TxtO

OUT_VALUE

Data (Payload) of the PUBLISH message being sent.
Note: sending of the message is performed as a result of writing to the OUT_VALUE I/O tag (i.e. if the Topic does not change, it is sufficient to set the OUT_TOPIC I/O tag once - e.g. by using a default value).
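A hypothetical example: to publish the text {"relay": 1} to the topic devices/sensor1/cmd, the OUT_TOPIC I/O tag is set to devices/sensor1/cmd (e.g. once, using a default value) and the text {"relay": 1} is then written to the OUT_VALUE I/O tag. Each write to OUT_VALUE results in one PUBLISH message being sent.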
I/O tags for parsing JSON messages

TxtI, TxtO, Qi,
Ci, Co,
Ai, Ao,
Di, Do,
TiR, ToR

JA=json_address

If Payload Type = JSON, the message is parsed as JSON data. The json_address value specifies the name of the JSON field whose value is to be assigned to the I/O tag.
For JSON messages, which can be structured, the syntax level1.level2.level3 ... is supported, e.g. rx.current; if they contain arrays (indexed from 1), the syntax level1[index1].level2[index2].level3 ... is also possible, e.g. rx.gwrx[1].time.
Since the JSON message itself can be an array, the address can also start with an index, e.g. JA=[1].batt_cell_v_avg.
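A hypothetical example: for the JSON message

{
    "batt": 3.62,
    "rx": {
        "current": 1.25,
        "gwrx": [
            {"rssi": -97, "time": "2024-10-30T17:25:29Z"}
        ]
    }
}

the following addresses could be used:

JA=batt (value 3.62)
JA=rx.current (value 1.25)
JA=rx.gwrx[1].rssi (value -97; arrays are indexed from 1)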

For other examples, see the description of the LoRaWAN protocol's Envelope type I/O tags.

I/O tags for parsing Sparkplug messages

TxtI, TxtO, Qi,
Ci, Co,
Ai, Ao,
Di, Do,
TiR, ToR, TiA, ToA

Input I/O tags:
SA=sparkplug_address

Output I/O tags:
ST=type;SA=sparkplug_address

ST=type;SAL=alias;SA=sparkplug_address

If Payload Type = Sparkplug Host/Edge Node, the message is parsed as Sparkplug data (a binary format built on Google Protocol Buffers). Sparkplug data contains metrics that have text identifiers (sparkplug_address).

Reading template items  is possible by specifying sparkplug_address in the format <TemplateName1><Separator><TemplateName2><Separator> ... <Separator><ItemName> where:

  • <TemplateNameX> is the name of the template/sub-template
  • <Separator> is the separator for individual levels (standard characters "->", which can be changed with the Item Separator parameter if this sequence occurs in template/item names)
  • <ItemName> is the item name of the deepest nested template

Examples of template item addresses:
SA=Template1->SubTemplate2->Item
SA=secUDT->sec

Reading dataset items (equivalent to structured variables in D2000) is possible by specifying sparkplug_address in the format <DatasetName>[<Row>]^<ColumnName> where:

  • <DatasetName> is the name of the dataset (it can also be part of a structure, e.g. Template1->SubTemplate2->Dataset3)
  • <Row> is the row number (1..N) or the "*" character. In the latter case, it is possible to configure the Destination column to which all rows are written (the value from the first row of the corresponding column is written to the I/O tag)
  • <ColumnName> is the name of the dataset column

Examples of dataset item addresses:
SA=Performance[3]^ActivePower
SA=Machine2->Parameters[1]^ActivePower
SA=DHS/Formation Data->Reservoir Parameter[*]^Gas density


For output I/O tags, the value type must be specified (ST=type). Simple types are supported (not template items/dataset items):

  • Int8
  • Int16
  • Int32
  • Int64
  • UInt8
  • UInt16
  • UInt32
  • UInt64
  • Float
  • Double
  • Boolean
  • String
  • DateTime
  • Text

Note: There is no difference between the String and Text types.
Note: for Payload Type = Edge Node, the D2000 KOM process sends DBIRTH/NBIRTH messages that contain only I/O tags with a defined value type.

For Payload Type = Edge Node, it is also possible to specify a numerical alias (SAL=alias) as a natural number (0, 1, 2, ...) for both input and output I/O tags. The alias of an I/O tag must be unique within the station. Aliases allow the size of the transmitted data to be reduced: in the NBIRTH/DBIRTH message, both the text identifier (sparkplug_address) and the alias are sent for each I/O tag; in the NDATA/DDATA messages, only the aliases, which are shorter than the text addresses, are sent. Aliases are only used if Sparkplug Alias Mode = "Default (SAL=alias)".
Note: If aliases are used in the I/O tag address in Payload Type = Sparkplug Host mode, they will be replaced by the aliases from the NBIRTH/DBIRTH message. However, if for some reason the Edge Node does not send these messages (and sends NDATA/DDATA with aliases), the configured aliases can be useful for matching an alias to a text name.

The PUBLISH message created during writing contains a Topic derived from the station address. The message type depends on the station address:

  • for Payload Type = Sparkplug Host, depending on whether it is an Edge Node (NCMD) or Device/Sensor (DCMD)
  • for Payload Type = Edge Node, depending on whether it is an Edge Node (NDATA) or Device/Sensor (DDATA)

The Payload of the message contains a timestamp, a value type (type), a written value (encoded according to the specified value type), and a metric name (sparkplug_address) or an alias.
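For illustration (hypothetical values, shown in the JSON representation used by the IN_SP2JS I/O tag described below; the actual message is a binary Sparkplug B payload): writing the value 7 to an output I/O tag with the address ST=UInt16;SA=Second on a station with Payload Type = Sparkplug Host could produce an NCMD/DCMD message with a payload such as:

{
    "metrics": [
        {
            "datatype": 6,
            "int_value": 7,
            "name": "Second",
            "timestamp": 1730305529536
        }
    ],
    "timestamp": 1730305529536
}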

TxtI

IN_SP2JS

The I/O tag is used to convert the Sparkplug payload into a JSON representation, which can then be processed, e.g., in an ESL script. Depending on the Convert Datatype/Timestamp to Text parameter, a textual representation of the value type and timestamp is also added. 
An example of value:

{"metrics":[{"datatype":3,"int_value":7338992,"name":"Corrected Vol Acc Stn","timestamp":1729664005479}],"seq":32,"timestamp":1729664005479}

After formatting into a readable form:

{
    "metrics": [
        {
            "datatype": 3,
            "int_value": 7338992,
            "name": "Corrected Vol Acc Stn",
            "timestamp": 1729664005479
        }
    ],
    "seq": 32,
    "timestamp": 1729664005479
}

An example of a more complex value containing properties and a dataset, which also shows a textual representation of the data type (datatype_txt) and timestamp (timestamp_txt) as a result of the Convert Datatype/Timestamp to Text parameter being set:

{
    "metrics": [
        {
            "datatype": 12,
            "datatype_txt": "String",
            "name": "Node Properties/Configuration",
            "string_value": "{}",
            "timestamp": 1730305529539,
            "timestamp_txt": "30-10-2024 17:25:29.539"
        },
        {
            "alias": 30064771073,
            "datatype": 5,
            "datatype_txt": "Uint8",
            "int_value": 0,
            "name": "Node Properties/Missing Param",
            "properties": {
                "keys": [
                    "usage"
                ],
                "values": [
                    {
                        "string_value": "technical information",
                        "type": 12,
                        "type_txt": "String"
                    }
                ]
            },
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        },
        {
            "alias": 0,
            "dataset_value": {
                "columns": [
                    "topic_name",
                    "offset",
                    "length",
                    "crc"
                ],
                "num_of_columns": 4,
                "rows": [
                    {
                        "row": [
                            "N/A",
                            0,
                            0,
                            0
                        ]
                    }
                ],
                "types": [
                    12,
                    7,
                    7,
                    7
                ],
                "types_txt": [
                    "String",
                    "UInt32",
                    "UInt32",
                    "UInt32"
                ]
            },
            "datatype": 16,
            "datatype_txt": "DataSet",
            "name": "Node Control/FW Update",
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        }
    ],
    "seq": 0,
    "timestamp": 1730305529536,
    "timestamp_txt": "30-10-2024 17:25:29.536"
}

...