    Kai Hudalla
    thanks for the info. Looks like I will then use a merge command for updating two properties in one go ...
    Thomas Jaeckle
    we'll have to postpone the Ditto 2.0 release planned for today to next week
    still wrapping up the last PRs which should get in 2.0
    Thomas Jaeckle
    @/all on behalf of the complete Eclipse Ditto team: We are very happy to announce the successful release and availability of Ditto 2.0.0
    Thank you for all your support, and especially for your (code) contributions which made it into this huge release
    Frédéric Desbiens
    @thjaeckle This is an impressive release! Congrats to the team! Do you think it could be useful to introduce the new features and changes to the community at the next IoT community meeting on May 25?
    Thomas Jaeckle
    hi @fdesbiens and thank you :)
    I will ask around as I am on vacation in that week and won't be able to present something .. let me get back to you
    Frédéric Desbiens
    It can wait for June if this is not a good fit.
    Thomas Jaeckle
    @fdesbiens I think it'll have to wait for June, yes
    Hello, does anyone know how I can obtain historical state values from Ditto? I found out there is no API for that, but is there any other way to obtain that data?
    Thomas Jaeckle
    hi @Canadianboy122_twitter
    if you have access to the MongoDB backing Ditto, you can find the historical state values in the MongoDB collection things_journal
    by default, this collection is however cleaned up periodically - if you don't want that clean-up you may configure the environment variable PERSISTENCE_CLEANUP_ENABLED of the "concierge" service to false, see also:
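    A sketch of how that could look in the Docker Compose deployment — only the variable name comes from the message above; the abbreviated service block is an assumption based on the default Ditto docker-compose.yml:

    ```yaml
      concierge:
        environment:
          # disable the periodic clean-up so historical entries
          # stay in the things_journal collection
          - PERSISTENCE_CLEANUP_ENABLED=false
    ```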
    Thomas Jaeckle
    @/all we released a Ditto 2.0.1 bugfix today, if you are on 2.0.0, please consider updating: https://www.eclipse.org/ditto/release_notes_201.html
    Amable Valdés

    Greetings community, let's see if you can help me with a problem I have.

    I have followed this guide (https://github.com/eclipse/ditto-examples/tree/master/mqtt-bidirectional) to connect Ditto, Mosquitto and a device using the MQTT protocol. It has been a total success and I am already receiving messages from the defined topic.

    The problem I have is with a feature of my device. This device transmits a very large payload (think of base64-encoded audio, image or video data), and it triggers this exception in the ditto-connectivity microservice:

    2021-05-20 16:29:46,171 INFO  [31343174-6640-4ebd-a842-4009660f2ef1] o.e.d.c.s.m.InboundDispatchingActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/inboundDispatching - Got exception <things:thing.toolarge> when processing external message with mapper <javascript>: <The size of '257.071' kB exceeds the maximal allowed Thing size of '100' kB.>

    Of course, Ditto has a size limit on its messages that can be configured as described here: https://www.eclipse.org/ditto/release_notes_080-M2.html#changes

    The problem is that I have already configured the ditto-limits.conf file this way ...

      # limitations regarding "Thing" entity / "things" service
      things {
        max-size = 500k
      }
      # limitations regarding "messages" (e.g. via WebSocket, connectivity)
      messages {
        max-size = 500k
        headers-size = 5k
        headers-size = ${?LIMITS_MESSAGES_HEADERS_SIZE}
        auth-subjects-count = 100
        auth-subjects-count = ${?LIMITS_MESSAGES_AUTH_SUBJECTS_COUNT}
      }

    and the ditto-akka-config.conf this way:

    advanced {
            # Maximum serialized message size, including header data. # default: 256 KiB
            maximum-frame-size = 1256 KiB
            maximum-frame-size = ${?REMOTE_MAX_FRAMESIZE}
            # Direct byte buffers are reused in a pool with this maximum size.
    }

    I have deleted the containers and redeployed with Docker with these files edited, and it still does not work... I do not know if I am doing something wrong... Could it be that I am changing the value in the file instead of using system variables with export? I have tried export LIMITS_THINGS_MAX_SIZE=500 and running docker-compose after that, but it hasn't worked...

    And, since I'm asking: how do you see the approach of using base64 to encode multimedia? Is there a better approach to encoding multimedia content for a Ditto Thing? Thank you very much for reading me ^.^


    @amable_alisys_gitlab how did you do the last step of this tutorial, where you define the message converter for MQTT on Ditto? I only get an error in the parsing function.
    I have this, I guess:

    curl -X POST http://localhost:8080/devops/piggyback/connectivity?timeout=10 -u 'devops:foobar' -H 'Content-Type: application/json' -d '{
        "targetActorSelection": "/system/sharding/connection",
        "headers": {
            "aggregate": false
        },
        "piggybackCommand": {
            "type": "connectivity.commands:modifyConnection",
            "connection": {
                "id": "mqtt-example-connection-123",
                "connectionType": "mqtt",
                "connectionStatus": "open",
                "failoverEnabled": true,
                "uri": "tcp://test.mosquitto.org:1883",
                "sources": [{
                    "addresses": ["ditto-tutorial/#"],
                    "authorizationContext": ["nginx:ditto"],
                    "qos": 0,
                    "filters": []
                }],
                "targets": [{
                    "address": "ditto-tutorial/{{ thing:id }}",
                    "topics": [],
                    "authorizationContext": ["nginx:ditto"],
                    "qos": 0
                }],
                "mappingContext": {
                    "mappingEngine": "JavaScript",
                    "options": {
                        "incomingScript": "function mapToDittoProtocolMsg(headers, textPayload, bytePayload, contentType) {const jsonString = String.fromCharCode.apply(null, new Uint8Array(bytePayload));const jsonData = JSON.parse(jsonString); const thingId = jsonData.thingId.split(':'); const value = { temp_sensor: { properties: { value: jsonData.temp } },altitude: { properties: { value: jsonData.alt }}}; return Ditto.buildDittoProtocolMsg(thingId[0], thingId[1], 'things', 'twin', 'commands', 'modify', '/features', headers, value);}"
                    }
                }
            }
        }
    }'

    And the response:

    {"?":{"?":{"type":"devops.responses:errorResponse","status":400,"serviceName":null,"instance":null,"payload":{"status":400,"error":"connectivity:connection.configuration.invalid","message":"The message mapper configuration failed due to: syntax error (incomingScript#1) - in line/column #1/242, source:\nfunction mapToDittoProtocolMsg(headers, textPayload, bytePayload, contentType) {const jsonString = String.fromCharCode.apply(null, new Uint8Array(bytePayload));const jsonData = JSON.parse(jsonString); const thingId = jsonData.thingId.split(:); const value = { temp_sensor: { properties: { value: jsonData.temp } },altitude: { properties: { value: jsonData.alt }}}; return Ditto.buildDittoProtocolMsg(thingId[0], thingId[1], things, twin, commands, modify, /features, headers, value);}","description":"Check the configuration options of your mapper for errors."}}}}
    Amable Valdés

    Oh, yes, let me explain:

    You must send that request as you do; that is fine. But the code in "incomingScript" must be JavaScript code without errors/bugs.

    As it is all in one line, it is easy to make a mistake, so I recommend using an editor like Visual Studio Code or another to write the JavaScript code and then paste it into the request.

    In your case I see you have put

    const thingId = jsonData.thingId.split(:);

    and it must be

    const thingId = jsonData.thingId.split(':');

    the same with:

    return Ditto.buildDittoProtocolMsg(thingId[0], thingId[1], things, twin, commands, modify, /features, headers, value);}

    It must be something like:

    return Ditto.buildDittoProtocolMsg(thingId[0], thingId[1], 'things', 'twin', 'commands', 'modify', '/features', headers, value);

    To test it I have used Visual Studio Code ^.^
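    For reference, here is the same incoming script reformatted over multiple lines so it can be run and tested outside Ditto. The Ditto.buildDittoProtocolMsg helper only exists inside Ditto's JavaScript mapping sandbox, so it is stubbed here purely for local illustration (the stub's topic format is an assumption):

    ```javascript
    // Stub standing in for the helper Ditto's JavaScript sandbox provides;
    // it only exists inside Ditto, so it is faked here to run the script locally.
    var Ditto = {
        buildDittoProtocolMsg: function (namespace, id, group, channel, criterion, action, path, headers, value) {
            return {
                topic: [namespace, id, group, channel, criterion, action].join('/'),
                path: path,
                headers: headers,
                value: value
            };
        }
    };

    function mapToDittoProtocolMsg(headers, textPayload, bytePayload, contentType) {
        // decode the raw MQTT byte payload into a JSON string
        var jsonString = String.fromCharCode.apply(null, new Uint8Array(bytePayload));
        var jsonData = JSON.parse(jsonString);
        // note the quoted ':' - the missing quotes were the syntax error above
        var thingId = jsonData.thingId.split(':');
        var value = {
            temp_sensor: { properties: { value: jsonData.temp } },
            altitude: { properties: { value: jsonData.alt } }
        };
        return Ditto.buildDittoProtocolMsg(thingId[0], thingId[1],
            'things', 'twin', 'commands', 'modify', '/features', headers, value);
    }

    // quick local check with a sample payload
    var bytes = new TextEncoder().encode('{"temp": 30, "alt": 360, "thingId": "my.test:octopus"}');
    var msg = mapToDittoProtocolMsg({}, null, bytes.buffer, 'application/json');
    console.log(msg.topic); // my.test/octopus/things/twin/commands/modify
    ```

    Running the function locally like this catches syntax errors before the one-liner is pasted into the piggyback command.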


    @amable_alisys_gitlab Oh thanks, I see it. Now it works, yay. Can you say how you send a message to the MQTT broker?
    I try to update a value but nothing happens.

    I guess this is the right payload?
    mosquitto_pub -h test.mosquitto.org -t /ditto-tutorial/my.test:octopus -m '{ "temp": 30, "alt": 360, "thingId": "my.test:octopus" }'

    Amable Valdés

    @Canadianboy122_twitter your request seems fine, but I can't be sure, because to make it run I implemented a little client in Node.js that emulates a real device.

    (I have had to remove some parts of the code, sorry. This code is more or less functional and would have to be tweaked to make it work, but it is a good sketch that lets you see what I'm doing.)

    var mqtt    = require('mqtt');
    var client  = mqtt.connect("mqtt://test.mosquitto.org:1883");
    var options = { qos: 0, retain: false };
    console.log("connected flag  " + client.connected);
    client.on("error", function(error) { console.log("Can't connect. Error: " + error); process.exit(1); });
    client.on("connect", function() {
        console.log("connected  " + client.connected);
    });
    // handle incoming messages
    client.on('message', function(topic, message, packet) {
        console.log("Message received:");
        console.log("\tmessage is: " + message);
        console.log("\ttopic is: " + topic);
    });
    function publish(topic, msg, options) {
        if (client.connected == true) {
            client.publish(topic, msg, options);
        }
    }
    var topic = "ditto-tutorial/my.test:octopus";
    var json = { temp: 30.77, alt: 359.341, thingId: "my.test:octopus" };
    // I subscribe myself to check on the console the messages sent
    client.subscribe(topic, { qos: 1 }); // single topic
    var timer_id = setInterval(function() {
        json.alt = json.alt + 1;
        publish(topic, JSON.stringify(json), options);
    }, 5000);

    If you want to follow this approach and implement an MQTT client in Node.js, I followed this guide and perhaps it can help you:

    @amable_alisys_gitlab Thank you very much. I will check it :D
    Amable Valdés
    I will give you all more data about my error hoping you can help me solve it:
    • I am on Ubuntu 18.04
    • I get the project with git clone --depth 1 https://github.com/eclipse/ditto
    • I cd into the /ditto/deployment/docker directory.
    • I use export VAR=value for all of them, and if I do export -p I can see the variables declared:
      declare -x DITTO_EXTERNAL_PORT="80"
      declare -x HOME="/home/elgestor"
      declare -x LANG="C.UTF-8"
      declare -x LESSCLOSE="/usr/bin/lesspipe %s %s"
      declare -x LESSOPEN="| /usr/bin/lesspipe %s"
      declare -x LIMITS_MESSAGES_MAX_SIZE="1000k"
      declare -x LIMITS_POLICIES_MAX_SIZE="500k"
      declare -x LIMITS_THINGS_MAX_SIZE="500k"
      declare -x LOGNAME="elgestor"
      declare -x PWD="/home/elgestor/ditto/deployment/docker"
      declare -x REMOTE_MAX_FRAMESIZE="1256 KiB"
      declare -x SHELL="/bin/bash"
      declare -x SHLVL="1"
      declare -x SSH_TTY="/dev/pts/0"
      declare -x TERM="xterm"
      declare -x USER="elgestor"
      declare -x XDG_DATA_DIRS="/usr/local/share:/usr/share:/var/lib/snapd/desktop"
      declare -x XDG_RUNTIME_DIR="/run/user/1000"
      declare -x XDG_SESSION_ID="652"
    • I use docker-compose up -d to start the containers.
    • Everything is deployed and fine.
    • I do sudo docker attach docker_connectivity_1 to see what the connectivity container is logging. All seems fine.
    • I send an MQTT message of 257 kB (a base64 image) and the connectivity container shows this log:
      2021-05-24 09:54:45,418 INFO  [] o.e.d.c.s.m.InboundDispatchingActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/inboundDispatching - Resolved mapped headers of ImmutableDittoHeaders [{mqtt.qos=0, mqtt.retain=true, ditto-entity-id=thing:camera:camera, mqtt.topic=ditto-tutorial/camera:camera, response-required=false, ditto-reply-target=0, ditto-expected-response-types=["error","response"], ditto-origin=mqtt-example-connection-123, ditto-auth-context={"type":"pre-authenticated-connection","subjects":["nginx:ditto"]}, correlation-id=9a9b1454-943d-4f2c-81bb-71d5f17a39fe}] : with HeaderMapping Optional[ImmutableHeaderMapping [mapping={}]] : and external headers {mqtt.qos=0, mqtt.retain=true, mqtt.topic=ditto-tutorial/camera:camera}
      2021-05-24 09:54:47,350 INFO  [5ae7a31d-df8f-4022-ad15-be2ad2ce56bd] o.e.d.c.s.m.InboundDispatchingActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/inboundDispatching - Got exception <things:thing.toolarge> when processing external message with mapper <javascript>: <The size of '257.071' kB exceeds the maximal allowed Thing size of '100' kB.>
      2021-05-24 09:54:47,350 INFO  [] o.e.d.c.s.m.InboundDispatchingActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/inboundDispatching - Resolved mapped headers of ImmutableDittoHeaders [{mqtt.qos=0, mqtt.retain=true, ditto-entity-id=thing:camera:camera, mqtt.topic=ditto-tutorial/camera:camera, response-required=false, ditto-reply-target=0, ditto-expected-response-types=["error","response"], ditto-origin=mqtt-example-connection-123, ditto-auth-context={"type":"pre-authenticated-connection","subjects":["nginx:ditto"]}, correlation-id=5ae7a31d-df8f-4022-ad15-be2ad2ce56bd}] : with HeaderMapping Optional[ImmutableHeaderMapping [mapping={}]] : and external headers {mqtt.qos=0, mqtt.retain=true, mqtt.topic=ditto-tutorial/camera:camera}
      2021-05-24 09:54:47,398 INFO  [] o.e.d.i.u.p.m.DefaultPersistenceStreamingActor akka://ditto-cluster/user/connectivityRoot/persistenceStreamingActor - Starting stream for <SudoStreamPids [type=streaming:SudoStreamPids, dittoHeaders=ImmutableDittoHeaders [{}], burst=25, timeoutMillis=10800000, lowerBound=LowerBound [id=:_, revision=0]]>
    • The error repeats until it breaks the container, as seen in the log.

    @amable_alisys_gitlab Why do you think it's an error? Those look like INFO logs, not ERROR; what is the problem with them?
    Oh, I see the size problem now, hmmm. If I find something I will edit this message.

    And I have a question about messages, if anyone can help. Is there an example of how a Thing should answer a message? There are some examples with the coffeeBrewer, but I did not see code for how Things answer them.

    Thomas Jaeckle
    @amable_alisys_gitlab you have to specify the environment variables in the docker-compose.yaml file.. Docker containers don't pick up the environment of the host system.
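    For illustration, the variables go into each service's environment section of docker-compose.yml rather than the host shell. The service name and values here are just examples mirroring the variables mentioned earlier in this thread:

    ```yaml
      things:
        environment:
          - LIMITS_THINGS_MAX_SIZE=500k
          - LIMITS_MESSAGES_MAX_SIZE=1000k
          - REMOTE_MAX_FRAMESIZE=1256KiB
    ```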
    @Canadianboy122_twitter at the link you provided there is an example, in JS code and JSON, of how a device must answer: https://www.eclipse.org/ditto/protocol-specification-things-messages.html#responding-to-a-message
    Amable Valdés

    Hi @thjaeckle, thanks for your response. It is resolved; I have put the environment variables in the docker-compose.yml instead of in the system, and all is OK. I had put the variables in the system because I was also defining the port on which Ditto listens as a system variable, but as you say it makes sense to put it in the Compose file.

    However, I feel obliged to contact you again... Once that was solved, when trying to transmit an image (a 1280x720 image of 138 kB - https://i.pinimg.com/originals/7c/2f/4b/7c2f4bfbaa411a9ef5b45bd0b4214fba.jpg) using MQTT, I received this error message in the log of docker_connectivity_1:

    2021-05-25 08:58:06,047 INFO  [1eb155dc-7f6b-4e81-8cb9-867f338f37e5] o.e.d.i.m.a.AcknowledgementAggregatorActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/inboundDispatching/ackr13-1eb155dc-7f6b-4e81-8cb9-867f338f37e5 - Stopped waiting for acknowledgements because of ditto runtime exception <org.eclipse.ditto.base.model.signals.commands.exceptions.GatewayInternalErrorException [message='There was a rare case of an unexpected internal error.', errorCode=gateway:internalerror, httpStatus=HttpStatus [code=500, category=SERVER_ERROR], description='Please contact the service team.', href=null, dittoHeaders=ImmutableDittoHeaders [{}]]>.
    2021-05-25 08:58:06,050 INFO  [] o.e.d.c.s.m.OutboundMappingProcessorActor akka://ditto-cluster/system/sharding/connection/1/mqtt-example-connection-123/pa/$a/c1/outboundMappingProcessor - Got DittoRuntimeException 'gateway:internalerror' when ExternalMessage was processed: There was a rare case of an unexpected internal error. - Please contact the service team.

    As it says to contact the service team, I contact you jejeje.

    I have transmitted a low-resolution image too (a 326x437 image of 10 kB - https://images-na.ssl-images-amazon.com/images/I/41n1HcdPJ%2BL._AC_SX355_.jpg) in base64 and all is fine; it appears saved on the Ditto Thing and I can extract it without problems. I used the same code to transmit both images, so the problem is the data transmitted.

    I have modified the docker-compose.yml like this:

      connectivity:
        image: docker.io/eclipse/ditto-connectivity:${DITTO_VERSION:-latest}
        mem_limit: 384m
        networks:
          - ditto-cluster
        depends_on:
          - policies
          - concierge
        environment:
          - TZ=Europe/Berlin
          - INSTANCE_INDEX=1
          - BIND_HOSTNAME=
          - OPENJ9_JAVA_OPTIONS=-XX:+ExitOnOutOfMemoryError -Xtune:virtualized -Xss512k -XX:MaxRAMPercentage=80 -Dakka.coordinated-shutdown.exit-jvm=on -Dakka.cluster.shutdown-after-unsuccessful-join-seed-nodes=120s
          - MONGO_DB_HOSTNAME=mongodb
          # Set additional configuration options here
          # -Dditto.connectivity...
          - LIMITS_THINGS_MAX_SIZE=500k
          - LIMITS_MESSAGES_MAX_SIZE=1000k
          - REMOTE_MAX_FRAMESIZE=1256KiB
        command: java -jar starter.jar

    If you need more data about the error I will be happy to provide it ^.^

    Thomas Jaeckle
    @amable_alisys_gitlab did you put those environment variables on all services in the Compose file? You need to 😉
    Tim S
    Real easy question here: Would it be easy to log the state of virtual devices over time? I've never used Ditto or Prometheus before, but from the docs it looks like Prometheus would only log Ditto's state, not the states of the devices it represents. So can that be configured easily?
    Amable Valdés
    Thanks @thjaeckle, that was it; I needed to put the environment variables on all the services in the Compose file ^.^
    Alexander Wellbrock

    I tried to upgrade from 1.5 to 2.0.1, but I'm experiencing a severe performance drop in the gateway service. Its load rises by approx. 300% and it starts to print WARN messages about slow HTTP connections. Any idea how to troubleshoot this, or even what could be the cause?

    I also notice new JWT parser errors (eclipse/ditto#1073), but I'm not sure if that is related. Sometimes requests come through and are processed as expected, but most of the time, from the client side of things, I only see 503 errors. If this is a bug I'll open an issue, but maybe it's my specific setup/config and I just don't see the cause.

    Thomas Jaeckle
    @lionax_gitlab we did not encounter such a problem in the gateway ourselves - and we run that version on several environments with different usage patterns
    could you maybe activate debug logging and with that hopefully find out which code prints a lot of the same messages?
    I also don't think that #1073 is related, but thx for reporting
    Alexander Wellbrock
    @thjaeckle thanks for your input, I'll run the service with debug logging and see what it turns up
    Alexander Wellbrock
    Let me ask your opinion on this: what do you think about adding a field "prototype" to the thing and feature level containing thingId/feature-path/json-pointer format? I'm thinking about using existing things as prototypes for new things. At first I'd just want to be able to place those references in the thing so my management-service creating, updating and house-keeping things can easily access that information. Later this could be extended such that ditto "clones" the things / features into new things. My goal here is to have a place to prepare some configuration of features or things, then use those prepared "blueprint" / "prototype" things as base for a bunch of newly created things and eventually patching/updating those things later on, hence the requirement for a lasting relationship / pointer in the things or features (that last part is most probably out of dittos scope)
    Thomas Jaeckle
    @lionax_gitlab why would you need another "prototype" field in the thing JSON? why not just use a thing itself as prototype for other things ..
    the same can already be done with policies: https://www.eclipse.org/ditto/protocol-examples-creatething.html#creatething-with-copied-policy-by-policy-id
    I have to admit that documentation of this "_copyPolicyFrom" is missing .. only some examples show that
    but same approach could IMO be used for your use-case
    Alexander Wellbrock
    For an initial copy that is true. My point is more around the idea of keeping that relationship so I can later see from which "prototype" this thing is copied and eventually also use that information to update the thing based on a diff to the "prototype". Currently I'd have to store that relationship in an attribute (which is totally fine for now). Hmm. I suppose a copyFrom method for things in the scope of ditto would make sense in this regard, while an additional "prototype" field does not - I can use an attribute for this quite niche use-case and there probably is no actual use-case in ditto itself to make use of such a persisted relationship
    Thomas Jaeckle
    @lionax_gitlab I also don't yet see that "prototype" concept being a core feature of Ditto
    Alexander Wellbrock
    Another thing, about metadata: I mentioned in the past that when creating larger things the attached metadata can get a bit out of hand: the largest thing requires around 300k of header size to transmit the necessary metadata for a full update. That is a problem for several reasons: header limits are way lower than this in all sorts of places, including most reverse proxies. It is also unclear to me yet whether headers this large have a major performance impact. I'm wondering if there should be an alternative API approach to managing metadata, or if it's just not meant to be used that excessively. Currently I'm splitting such large updates (which becomes easier with the patch feature), so with a reasonable limit of 32k header size the update is split into 10 messages. That's a bit inconvenient though, so what do others think about this?
    Alexander Wellbrock
    I'm using the extraFields option in WebSocket connections and now I'm wondering if I can dynamically reduce the metadata sent to only those that match / affect the fields that are transmitted in "values" of a message. By default I receive all metadata of all properties of a thing which increases message size considerably
    Thomas Jaeckle
    @lionax_gitlab I think the current metadata approach was not really intended to be used having more metadata than actual data?
    do you put model related information in there which is the same for a lot of things? that would IMO be not a good use-case for metadata,
    and regarding the second question: no, that's not yet supported
    Alexander Wellbrock
    @thjaeckle hmm. I thought as much :D Metadata proved a good way to store model-related data per datapoint which did not fit into Vorto. We found the mapper feature of Vorto unfitting for our use-case, and the amount of metadata one can attach to datapoints in Vorto is limited. Of course we didn't want to develop another backend to hold part of the model, scattering it about, if it can be avoided. Ditto metadata seems like a reasonable approach because: clients don't need to implement another data source for model data that has to be managed and introduces more requests; the data is attached directly to a datapoint and can be sent with it, reducing the need to wait on side-way requests to the model to complete before being able to process the events in other systems. An alternative that came to my mind is to use a certain structure in a feature's properties (maybe properties/status, properties/configuration or even properties/metadata), or a new feature altogether, and to then have property paths as keys with model + metadata specified that way. This however seemed overly complicated, and it is something that metadata as part of the thing model solves.
    Alexander Wellbrock
    Metadata is in our case used for e.g. attaching a measurement unit (so we don't have to do another query to Vorto), configuring an alert severity per property, configuring various view options per property for frontends (e.g. viewable, writeable), whether a property should be persisted in history, and validator options like min, max, step size. While many of these things can be configured in Vorto, not all of them can, so it does not fit our use-case entirely, hence the Ditto-metadata approach. How do you approach such property-based metadata? The thing is that all those values might have a "factory default" specified in a Vorto model (where some of this metadata is initially generated from) but can be configured later on by several sources. I know that metadata as implemented in Ditto is not ideal for our use-case either (but at least more flexible than Vorto), so let me ask: is it a valid use-case that might encourage further development of the metadata API in Ditto in this regard, or are there strong arguments that this approach is really problematic and not in the interest of the Ditto project?
    Thomas Jaeckle
    @lionax_gitlab I get the intention and I get why at the first look this looks like a good fit (I also thought so) - however, having that duplication of model information which is the same for many many things redundantly as part of the thing instances would be a huge waste of DB storage / memory, etc. ... and also of metadata header size limitations :)
    for that (static) model information the "definition" was intended to be used ...
    maybe another idea: could you put in the "definition" the thingId of a thing containing the model information and reference that from all things sharing the "same" model?
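    As a rough sketch of that idea (all identifiers hypothetical, not an official Ditto feature): the instance thing would reference its model thing via the definition field, and clients could resolve the shared model with a single extra request:

    ```json
    {
      "thingId": "org.example:sensor-0815",
      "definition": "org.example:sensor-model",
      "features": {
        "temperature": { "properties": { "value": 21.5 } }
      }
    }
    ```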
    Thomas Jaeckle
    regarding the further development of the metadata API in this direction: I don't think that e.g. a self-contained metadata HTTP API would be in our current interest - then we would have 3 "top-level" and very prominent concepts (attributes, features, metadata) where to put data, which is too confusing IMO
    Amable Valdés

    I must ask a "Thing" (hahaha): Is there a limit on the MQTT and HTTP request size? Could I, for example, set the environment variables LIMITS_THINGS_MAX_SIZE, LIMITS_POLICIES_MAX_SIZE, LIMITS_MESSAGES_MAX_SIZE and REMOTE_MAX_FRAMESIZE to GBs of data? Or is there a limit?

    I am trying to send a thing of 2 MB of data via HTTP and I get a "413 Payload Too Large" response. I have put the line client_max_body_size 0; in the nginx configuration file, but this error keeps appearing.

    Alexander Wellbrock

    @amable_alisys_gitlab hey there! While you sure can increase those limits, you'll hit hard limits eventually, since those limits ensure that data sent through the Ditto cluster does not become too huge for the cluster to manage. You could end up having huge performance issues by increasing those limits too much. If you have to get larger amounts of data through the cluster, I found that ways to get the data "around" the cluster are much more viable. For an example of how to by-pass the cluster, have a look at https://www.eclipse.org/ditto/advanced-data-by-pass.html

    A more straightforward approach would be trying to find an architecture for your use-case in which clients don't have to request data through the Ditto cluster.

    @thjaeckle hmm, well. Some of this metadata is not as static as you'd think. E.g. the display unit, min, max values or other property specific parameters could change dynamically either through user interaction or device context.
    Using a template / model / prototype thing with a link in the definition field could be a way, I have to think about that for a bit
    Thomas Jaeckle
    @amable_alisys_gitlab this is really a bad idea .. such large messages are not transported in a "streaming" way and when configuring such huge limits you would need to run with very big heap space settings as the messages are always completely read from / written to the heap memory
    the HTTP "413" most likely comes from Akka HTTP config which Ditto configures to 1 MB:
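    The relevant setting is presumably the standard Akka HTTP request-entity limit; this is an assumption based on Akka HTTP's documented configuration, so check Ditto's gateway configuration for the exact key it overrides:

    ```hocon
    akka.http.server {
      # maximum accepted size of an incoming HTTP request entity;
      # raising it also raises the heap pressure mentioned above
      parsing.max-content-length = 1m
    }
    ```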
    @lionax_gitlab I understand .. sometimes the unit (and other min/max values) is defined by the device sending the data and not the model .. makes sense
    I however wonder if in Ditto's Thing a "normalization" to a unit defined in the definition should be applied then .. the structure is also "normalized" to attributes and features (e.g. by payload mappers)
    Alexander Wellbrock
    @thjaeckle normalization is not an issue in this case, we already do that (doesn't matter if by payload mappers or other means). We mostly use the metadata for streaming data to process messages in other backends without the need to set off and wait on other model-related requests. So the metadata mostly holds frontend display information / configuration; database targets; alerting information. For the sake of fewer requests and having it "all in one place" we started to store those fields as metadata in ditto
    Amable Valdés
    Thanks for your answers @thjaeckle and @lionax_gitlab; don't worry, we will not transfer GBs of data with Ditto. I was just curious whether I could do it jejeje
    However, we are going to try to transmit a device with a lot of sensors and with cameras. It will be around 1-2 MB of data.
    I don't think I will have performance issues with a 2 MB max size, but I will keep an eye on the metrics of our machines. I will also study more carefully the Data By-Pass pattern. We are doing something in that style and I hadn't seen the page you pasted before. Thanks.
    For the cameras, at each point in time we take the image displayed at that moment and save it in Ditto as base64. I wanted to ask how you see this approach, whether it could cause a performance problem, and whether there is a better method to deal with the data from cameras or microphones (multimedia content in general) in Ditto.
    Thank you very much to both of you!
    Thomas Jaeckle
    @amable_alisys_gitlab if storing camera images as base64 in Ditto works for you with reasonable performance, I guess it is ok ..
    however I must say that for serving images, something like a CDN with proper HTTP caching in place would be better suited ;)