Big parameterised queries do not work with analytics #10

@MarieSaphira

Description

Describe the bug

When using parameterised queries, the message size is limited, so inserting

{"parameterTypes":["uuid","datetime","datetime","string","int","polygon"],
"query":"insert MeteologixWarnings { @id: ??id, time_start: ??mtime_start, time_end: ??mtime_end, warningType: ??mwarningType, severity: ??mseverity, area: ??marea } ",
"parameterNames":["id","mtime_start","mtime_end","mwarningType","mseverity","marea"],
"boundRows":[["5f2e3221-a0ae-4189-b06d-096b53520ec6","2020-12-19T00:00:00Z","2020-12-19T23:59:59Z","gust10m","0","POLYGON ((3.417 55.170308641975005, 3.51875 55.170308641975005, 3.51875 55.30487654321, 3.417 55.30487654321, 3.417 55.170308641975005))"],
...
]}

with more than ~3000 boundRows throws the following error in the API:

java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.RecordTooLargeException: 
The message is 2584882 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
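The limit named in the error is the Kafka producer's max.request.size, which defaults to roughly 1 MB, while the payload above serialises to about 2.5 MB. The issue does not show how the analytics producer is configured, so the following is only a generic Kafka producer sketch (the class name and the 5 MB value are placeholders, not taken from this project); note that the broker-side message.max.bytes must also allow messages of that size.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeRequestProducerConfig {
    public static KafkaProducer<String, String> createProducer(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Raise the per-request limit above the ~2.5 MB payload from this report.
        // The default is 1 MB; the broker's message.max.bytes must permit this size as well.
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 5 * 1024 * 1024);
        return new KafkaProducer<>(props);
    }
}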

Workaround

Either split the query into batches of at most ~3000 boundRows each (a client-side batching sketch follows below), or use the api/noAnalytics/update endpoint.
Maybe this should also be looked at on the API side @Stratidakos
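For the batching workaround, a minimal client-side sketch, assuming Jackson for JSON serialisation and java.net.http for the request; the endpoint URI is a placeholder (the issue only names api/noAnalytics/update as the alternative endpoint), the row values are strings as in the example payload above, and 3000 is the rough limit observed in this report.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class BoundRowsBatcher {
    private static final int MAX_ROWS_PER_REQUEST = 3000; // rough limit observed in this issue

    private final HttpClient client = HttpClient.newHttpClient();
    private final ObjectMapper mapper = new ObjectMapper();

    // Sends the parameterised query in chunks of at most MAX_ROWS_PER_REQUEST boundRows.
    // The endpoint URI is a placeholder; substitute the real analytics update endpoint.
    public void sendInChunks(URI endpoint, String query, List<String> parameterTypes,
                             List<String> parameterNames, List<List<String>> boundRows) throws Exception {
        for (int start = 0; start < boundRows.size(); start += MAX_ROWS_PER_REQUEST) {
            int end = Math.min(start + MAX_ROWS_PER_REQUEST, boundRows.size());
            // Payload shape follows the example in the bug description.
            Map<String, Object> payload = Map.of(
                    "parameterTypes", parameterTypes,
                    "query", query,
                    "parameterNames", parameterNames,
                    "boundRows", boundRows.subList(start, end));
            HttpRequest request = HttpRequest.newBuilder(endpoint)
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(mapper.writeValueAsString(payload)))
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() / 100 != 2) {
                throw new IllegalStateException("Batch " + start + ".." + end + " failed: " + response.body());
            }
        }
    }
}

Splitting keeps each serialised message safely under the producer limit without requiring any server-side configuration change.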
