...

  • Communication line category: TCP/IP-TCP or TCP/IP-TCP Redundant.
  • Host: IP address of MQTT server (or redundant addresses separated by a comma or semicolon).
    Note: In Payload Type = Sparkplug Edge Node mode, the Node Control/Next Server metric is supported, which can be used to command the D2000 KOM process to connect to the next MQTT server (if multiple servers are specified).
    Note: If the TCP/IP-TCP line is configured, one TCP connection is created, which can be directed to multiple specified IP addresses. If the TCP/IP-TCP Redundant line is configured, two TCP connections are created, each of which can be directed to multiple specified IP addresses.
  • Port: the default port number is 1883 (or 8883 for the encrypted SSL/TLS variant).
  • Line number: unused, set the value to 0.

...

openssl x509 -text -in file.crt | grep "After"
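As a variant of the check above, openssl can print the expiry date directly with -noout -enddate instead of grepping the full text dump. The snippet below first generates a throwaway self-signed certificate purely so that it is self-contained; in practice, point the second command at the broker's real certificate file.

```shell
# Create a throwaway self-signed certificate (illustration only; in practice
# use the broker's real certificate file instead of /tmp/demo.crt).
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
  -out /tmp/demo.crt -days 30 -subj "/CN=mqtt-demo" 2>/dev/null
# Print only the expiry date ("notAfter=...") of the certificate:
openssl x509 -noout -enddate -in /tmp/demo.crt
```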


Notes on MQTT broker redundancy

If the TCP/IP-TCP line is configured, one TCP connection is created, which can be directed to one of multiple configured IP addresses.

If the TCP/IP-TCP Redundant line is configured, two TCP connections are created (to 2 MQTT brokers), each of which can be directed to one of multiple configured IP addresses.


Forced disconnection: If all stations on the line are in simulation mode, or communication is stopped for all of them, the line will be disconnected (the communication socket will be closed). If simulation is disabled for at least one station and communication is not stopped for it (the Parameters tab of the Station type object), the line will be connected again.

...

Type of I/O tag | Address | Description
I/O tags for reading data sent by the MQTT server through a PUBLISH message (usually used in Text mode or JSON mode, rarely in Sparkplug mode).
Note: Values of I/O tags are set by the D2000 KOM process in the order IN_TOPIC, IN_DATA, and IN_ID. The configuration doesn't need to contain all three I/O tags.
TxtI | IN_TOPIC
Topic (Topic) of the received PUBLISH message.
TxtI | IN_DATA
Data (Payload) of the received PUBLISH message.
Ci | IN_ID
Identifier of a packet (Packet Identifier) of a PUBLISH message that depends on the level of validation (QoS).
For messages sent with QoS_0, the identifier is zero; for QoS_1 and QoS_2, it is a positive 16-bit number. On the TCP/IP-TCP line, the identifier is monotonically increasing; on the TCP/IP-TCP Redundant line, values from two monotonically increasing sequences can alternate (so they can also be repeated), so the recommendation given in the following note applies:
Note: If the MQTT server sends also messages with the QoS_0 level of validation and the ACK_ID I/O tag is configured, then we recommend activating the option New value when changing time in the Filter tab, so that repeated writing of the value 0 will cause a new value that differs only in a timestamp to be generated.
I/O tag to confirm the received data to the MQTT server.
Co | ACK_ID
If an output I/O tag with the ACK_ID address is defined, the D2000 KOM process expects confirmation of the processing of each message by writing a copy of the value of the IN_ID tag. Only then does it set the values from the next received PUBLISH message (if one was received in the meantime) into the IN_TOPIC, IN_DATA, and IN_ID I/O tags (in this order).
With the QoS_0 level of confirmation, it is therefore necessary to repeatedly set the ACK_ID I/O tag to 0.
If the ACK_ID I/O tag does not exist, the values are written into the IN_TOPIC, IN_DATA, and IN_ID I/O tags immediately after the PUBLISH message is received and processed.
Note: For messages received with the QoS_0 level of validation, no confirmation is sent to the MQTT server; only the values of the received PUBLISH message are published.
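The buffering behavior described above can be illustrated with a short Python sketch. This is a toy model, not D2000 code; the class name and the tag dictionary are invented for the example, and only the documented ordering rules are reproduced:

```python
from collections import deque

class InboundBuffer:
    """Toy model of the IN_TOPIC/IN_DATA/IN_ID + ACK_ID handshake:
    a newly received PUBLISH message is exposed in the I/O tags only
    after the previous one was acknowledged via ACK_ID."""

    def __init__(self):
        self.queue = deque()   # received messages waiting to be exposed
        self.current = None    # (topic, data, packet_id) currently exposed
        self.tags = {}         # simulated I/O tag values

    def on_publish(self, topic, data, packet_id):
        """A PUBLISH message arrived from the MQTT server."""
        self.queue.append((topic, data, packet_id))
        if self.current is None:
            self._expose_next()

    def ack(self, packet_id):
        """Confirmation: a copy of IN_ID was written to ACK_ID."""
        if self.current and packet_id == self.current[2]:
            self.current = None
            self._expose_next()

    def _expose_next(self):
        if self.queue:
            self.current = self.queue.popleft()
            topic, data, pid = self.current
            # Values are set in the documented order: IN_TOPIC, IN_DATA, IN_ID.
            self.tags['IN_TOPIC'] = topic
            self.tags['IN_DATA'] = data
            self.tags['IN_ID'] = pid
```

For QoS_0 messages the packet identifier is always 0, which is why acknowledging with 0 repeatedly (and the "New value when changing time" filter option) matters in that mode.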
I/O tags for sending values to the MQTT server through a PUBLISH message.
Note: in order for the D2000 KOM process to send the PUBLISH messages to the MQTT server, both I/O tags must be defined within one station.
TxtO | OUT_TOPIC
The topic of the PUBLISH message being sent.
Note: If the I/O tag with the OUT_TOPIC address does not exist, the station address will be used directly as the Topic (if it is empty, the writing will not be performed).
TxtO | OUT_VALUE
Data (Payload) of the PUBLISH message being sent.
Note: Sending the message is performed as a result of writing to the OUT_VALUE I/O tag (i.e., if the Topic does not change, it is sufficient to set the OUT_TOPIC I/O tag once, e.g., by using the default value).
I/O tags for parsing JSON messages

TxtI, TxtO, Qi, Ci, Co, Ai, Ao, Di, Do, TiR, ToR | JA=json_address

If Payload Type = JSON, the message is parsed as JSON data. The json_address value specifies the name of the JSON field whose value is to be assigned to the I/O tag.
Since JSON messages can be structured, the syntax level1.level2.level3 ... is supported (e.g. rx.current); if they contain arrays (indexed from 1), the syntax level1[index1].level2[index2].level3 ... is also possible (e.g. rx.gwrx[1].time).
Since the JSON message itself can be an array, the address can also start with an index, e.g. JA=[1].batt_cell_v_avg

For other examples, see the description of the LoRaWAN protocol's Envelope type I/O tags.
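The addressing scheme described above can be sketched in a few lines of Python. This is an illustration only, not D2000 code; the function name is invented, and the 1-based index handling follows the description above:

```python
import re

def resolve_ja(address, payload):
    """Resolve a JA=... style address against parsed JSON data.

    Supports dotted levels and 1-based [index] access,
    e.g. "rx.current", "rx.gwrx[1].time" or "[1].batt_cell_v_avg".
    """
    value = payload
    for part in address.split('.'):
        # Each level is an optional field name plus an optional [index].
        m = re.fullmatch(r'([^\[\]]*)(?:\[(\d+)\])?', part)
        name, index = m.group(1), m.group(2)
        if name:
            value = value[name]
        if index:
            value = value[int(index) - 1]  # JSON arrays are addressed 1-based
    return value

data = {"rx": {"current": 5, "gwrx": [{"time": "t0"}, {"time": "t1"}]}}
print(resolve_ja("rx.gwrx[1].time", data))  # -> t0
```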


Writing to an I/O tag with a JSON address is also supported, but it must not have indexes. Examples of correct addresses for writing:

  • JA=myaddress
  • JA=mystruct.mylevel.myitem

When writing, the generated JSON contains the value itself and optionally a timestamp, if the station protocol parameter Time Field Name or the line parameter Time Field Name is set.
Note: Before writing to an I/O tag with a JSON address, a topic must first be set (an I/O tag with the OUT_TOPIC address). If the I/O tag with the OUT_TOPIC address does not exist, the station address will be used directly as the Topic (if it is empty, the writing will not be performed).
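As a hedged illustration of how a dotted write address could map to nested JSON: the exact payload layout produced by the D2000 KOM process is not specified here, and the time_field parameter below merely stands in for the Time Field Name parameter. A minimal sketch under these assumptions:

```python
import json
import time

def build_write_payload(address, value, time_field=None, timestamp=None):
    """Build a nested JSON document from a dotted JA write address,
    e.g. "mystruct.mylevel.myitem" -> {"mystruct":{"mylevel":{"myitem": value}}}.
    If time_field is given, a timestamp field is added as well."""
    root = {}
    node = root
    parts = address.split('.')
    for part in parts[:-1]:
        node = node.setdefault(part, {})  # create intermediate objects
    node[parts[-1]] = value
    if time_field:
        # Millisecond epoch timestamp, as used in the examples in this document.
        root[time_field] = timestamp if timestamp is not None else int(time.time() * 1000)
    return json.dumps(root)
```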

I/O tags for parsing Sparkplug messages

TxtI, TxtO, Qi, Ci, Co, Ai, Ao, Di, Do, TiR, ToR, TiA, ToA

SA=sparkplug_address
SAL=alias;SA=sparkplug_address
ST=type;SA=sparkplug_address
ST=type;SAL=alias;SA=sparkplug_address

If Payload Type = Sparkplug Host/Edge Node, the message is parsed as Sparkplug data (a binary format built on Google Protocol Buffers). Sparkplug data contains metrics that have text identifiers (sparkplug_address) or possibly numeric aliases (alias).

Reading template items is possible by specifying sparkplug_address in the format <TemplateName1><Separator><TemplateName2><Separator> ... <Separator><ItemName> where:

  • <TemplateNameX> is the name of the template/sub-template
  • <Separator> is the separator for individual levels (by default the characters "->"; it can be changed with the Item Separator parameter if this sequence occurs in template/item names)
  • <ItemName> is the item name of the deepest nested template

Examples of template item addresses:
SA=Template1->SubTemplate2->Item
SA=secUDT->sec

Reading dataset items (equivalent to structured variables in D2000) is possible by specifying sparkplug_address in the format <DatasetName>[<Row>]^<ColumnName> where:

  • <DatasetName> is the name of the dataset (it can also be part of a structure, e.g., Template1->SubTemplate2->Dataset3)
  • <Row> is the row number (1..N) or the "*" character. In the latter case, it is possible to configure the Destination column to which all rows are written (the value from the first row of the corresponding column is written to the I/O tag)
  • <ColumnName> is the name of the dataset column

Examples of dataset item addresses:
SA=Performance[3]^ActivePower
SA=Machine2->Parameters[1]^ActivePower
SA=DHS/Formation Data->Reservoir Parameter[*]^Gas density
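The dataset address format above can be decomposed with a small Python sketch (illustrative only, not D2000 code; the function name and result fields are invented):

```python
import re

def parse_dataset_address(address, separator="->"):
    """Parse a dataset address of the form <DatasetName>[<Row>]^<ColumnName>,
    where <Row> is 1..N or "*" and the dataset name may itself be nested
    via the item separator (e.g. Template1->SubTemplate2->Dataset3)."""
    m = re.fullmatch(r'(.+)\[(\d+|\*)\]\^(.+)', address)
    if not m:
        raise ValueError("not a dataset item address: " + address)
    name, row, column = m.groups()
    return {
        "path": name.split(separator),            # nesting levels + dataset name
        "row": None if row == "*" else int(row),  # None stands for "*" (all rows)
        "column": column,
    }
```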

For output I/O tags, the value type can be explicitly specified (ST=type). Only simple types are supported (not template items/dataset items):

  • Int8
  • Int16
  • Int32
  • Int64
  • UInt8
  • UInt16
  • UInt32
  • UInt64
  • Float
  • Double
  • Boolean
  • String
  • DateTime
  • Text
  • Unknown
    (undefined type)

If the value type is not specified, the default value depends on the type of I/O tag:

I/O tag type | Value type
Dout | Boolean
Co | Int64
Ao | Double
ToA | DateTime
ToR | Double
TxtO | String
Note: There is no difference between the String and Text types.
Note: For Payload Type = Edge Node, the D2000 KOM process sends DBIRTH/NBIRTH messages that contain only I/O tags (both input and output ones) with a defined value type. To hide some (input) I/O tags, set ST=Unknown in their address.

For Payload Type = Edge Node, it is also possible to specify a numerical alias (SAL=alias) as a natural number (0, 1, 2, ...) for both input and output I/O tags. The alias of an I/O tag must be unique within the station. Aliasing reduces the size of the transmitted data: in the NBIRTH/DBIRTH message, both the text identifier (sparkplug_address) and the alias are specified for each I/O tag; in the NDATA/DDATA messages, only the aliases, which are shorter than the text addresses, are sent. An alias is only used if Sparkplug Alias Mode = "Default (SAL=alias)".
Note: If aliases are used in the I/O tag address in Payload Type = Sparkplug Host mode, they will be replaced by aliases from the NBIRTH/DBIRTH message. However, if for some reason the Edge Node does not send these messages (and sends NDATA/DDATA with aliases), aliases can be useful for matching a text name with an alias.

The PUBLISH message created during writing contains a Topic derived from the station address. The message type depends on the station address:

  • for Payload Type = Sparkplug Host: NCMD if the station addresses an Edge Node, DCMD if it addresses a Device/Sensor
  • for Payload Type = Edge Node: NDATA if the station addresses an Edge Node, DDATA if it addresses a Device/Sensor

The Payload of the message contains a timestamp, a value type (type), a written value (encoded according to the specified value type), and a metric name (sparkplug_address) or an alias.

TxtI | IN_SP2JS

The I/O tag is used to convert the Sparkplug payload into a JSON representation, which can then be processed, e.g., in an ESL script. Depending on the Convert Datatype/Timestamp to Text parameter, a textual representation of the value type and timestamp is also added. 
An example of a value:

{"metrics":[{"datatype":3,"int_value":7338992,"name":"Corrected Vol Acc Stn","timestamp":1729664005479}],"seq":32,"timestamp":1729664005479}

After formatting into a readable form:

{
    "metrics": [
        {
            "datatype": 3,
            "int_value": 7338992,
            "name": "Corrected Vol Acc Stn",
            "timestamp": 1729664005479
        }
    ],
    "seq": 32,
    "timestamp": 1729664005479
}

An example of a more complex value containing properties and a dataset, and also displaying a textual representation of the data type (datatype_txt) and timestamp (timestamp_txt) as a result of the set parameter Convert Datatype/Timestamp to Text:

{
    "metrics": [
        {
            "datatype": 12,
            "datatype_txt": "String",
            "name": "Node Properties/Configuration",
            "string_value": "{}",
            "timestamp": 1730305529539,
            "timestamp_txt": "30-10-2024 17:25:29.539"
        },
        {
            "alias": 30064771073,
            "datatype": 5,
            "datatype_txt": "Uint8",
            "int_value": 0,
            "name": "Node Properties/Missing Param",
            "properties": {
                "keys": [
                    "usage"
                ],
                "values": [
                    {
                        "string_value": "technical information",
                        "type": 12,
                        "type_txt": "String"
                    }
                ]
            },
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        },
        {
            "alias": 0,
            "dataset_value": {
                "columns": [
                    "topic_name",
                    "offset",
                    "length",
                    "crc"
                ],
                "num_of_columns": 4,
                "rows": [
                    {
                        "row": [
                            "N/A",
                            0,
                            0,
                            0
                        ]
                    }
                ],
                "types": [
                    12,
                    7,
                    7,
                    7
                ],
                "types_txt": [
                    "String",
                    "UInt32",
                    "UInt32",
                    "UInt32"
                ]
            },
            "datatype": 16,
            "datatype_txt": "DataSet",
            "name": "Node Control/FW Update",
            "timestamp": 1730305529537,
            "timestamp_txt": "30-10-2024 17:25:29.537"
        }
    ],
    "seq": 0,
    "timestamp": 1730305529536,
    "timestamp_txt": "30-10-2024 17:25:29.536"
}
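Such a JSON representation can then be processed, e.g., in an ESL script. As an illustration in Python (not D2000/ESL code; the helper name is invented, and only the value fields shown in the examples above, int_value and string_value, are recognized), the metric names and values can be extracted like this:

```python
import json

def metric_values(payload_json):
    """Extract {metric name: value} pairs from the JSON produced
    via an IN_SP2JS I/O tag (simple int/string metrics only)."""
    result = {}
    for metric in json.loads(payload_json).get("metrics", []):
        for key in ("int_value", "string_value"):
            if key in metric:
                result[metric.get("name")] = metric[key]
                break  # each metric carries one value field
    return result
```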

...

Tell commands

...


Command | Syntax | Description
STCOMMAND | STCOMMAND StationName PUTOLDVAL BeginTime EndTime

For Payload Type = Edge Node:

The Tell command reads out historical values of output I/O tags (if the I/O tags or their control objects are archived) and sends them as historical values. Reading works independently of the Store & Forward protocol parameter. Using this command, it is possible to send historical values even for the period when the D2000 KOM was turned off (and therefore the Store & Forward functionality could not work).
The BeginTime and EndTime parameters must be in the format "dd-mm-yyyy hh:mi:ss"

Example: STCOMMAND B.MQTT_MOSQUITTO_EN.Device1 PUTOLDVAL "30-05-2025 00:00:00" "30-05-2025 00:10:00"


Literature

...