
Updates for burrow v3 version of api schema. #9

Open

jbanton-dm wants to merge 1 commit into packetloop:develop from dailymotion-leo:develop

Conversation

@jbanton-dm

This could use some refactoring, but it at least makes it work with newer versions of Burrow that no longer support the v2 API.

if not offset:
    continue
if offset["timestamp"] > offset_timestamp:
    latest_offset = offset["offset"]

@jrpilat Apr 22, 2019


Below this line:
offsets += [latest_offset]
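
For context, a minimal sketch (illustrative variable names only, not the PR's actual diff) of how that appended line would sit in the per-partition loop, collecting the newest offset from each partition's v3 offsets list:

offsets = []
for partition in topic_partitions:  # one entry per partition in the v3 response
    latest_offset = None
    offset_timestamp = 0
    for offset in partition["offsets"]:  # samples of {"offset", "timestamp", "lag"}
        if not offset:
            continue
        if offset["timestamp"] > offset_timestamp:
            offset_timestamp = offset["timestamp"]
            latest_offset = offset["offset"]
    # reviewer's suggestion: keep the newest offset seen for this partition
    offsets += [latest_offset]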

@lenfree

Hi @jrpilat, unfortunately we're still on Kafka v2. Do you mind sharing an example JSON body of what this looks like in v3, to help us confirm whether this would break v2 or not? Thanks!

@jrpilat

@lenfree is this what you need?

curl -s http://localhost:8000/v3/kafka/default/consumer/test | jq .
{
  "error": false,
  "message": "consumer detail returned",
  "topics": {
    "Maker": [
      {
        "offsets": [
          {
            "offset": 247136,
            "timestamp": 1571187859717,
            "lag": 432
          },
          {
            "offset": 247175,
            "timestamp": 1571187860801,
            "lag": 393
          },
          {
            "offset": 247203,
            "timestamp": 1571187861806,
            "lag": 365
          },
          {
            "offset": 247254,
            "timestamp": 1571187862816,
            "lag": 314
          },
          {
            "offset": 247285,
            "timestamp": 1571187863866,
            "lag": 283
          },
          {
            "offset": 247323,
            "timestamp": 1571187865134,
            "lag": 245
          },
          {
            "offset": 247378,
            "timestamp": 1571187866144,
            "lag": 190
          },
          {
            "offset": 247438,
            "timestamp": 1571187867176,
            "lag": 130
          },
          {
            "offset": 247462,
            "timestamp": 1571187868263,
            "lag": 106
          },
          {
            "offset": 247492,
            "timestamp": 1571187869406,
            "lag": 76
          },
          {
            "offset": 247504,
            "timestamp": 1571187871205,
            "lag": 64
          },
          {
            "offset": 247516,
            "timestamp": 1571187872322,
            "lag": 52
          },
          {
            "offset": 247556,
            "timestamp": 1571187873389,
            "lag": 12
          },
          {
            "offset": 247566,
            "timestamp": 1571187875598,
            "lag": 2
          },
          {
            "offset": 247568,
            "timestamp": 1571187978098,
            "lag": 0
          }
        ],
        "owner": "",
        "client_id": "",
        "current-lag": 0
      }
    ]
  },
  "request": {
    "url": "/v3/kafka/default/consumer/test",
    "host": "757320aea5d7"
  }
}
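
A rough sketch of consuming that v3 payload (assuming the requests library and the same localhost cluster/group as the curl above; names are illustrative, not the plugin's actual code):

import requests

# Endpoint taken from the curl example above.
url = "http://localhost:8000/v3/kafka/default/consumer/test"
body = requests.get(url).json()

latest_offsets = {}
for topic, partitions in body["topics"].items():
    for index, partition in enumerate(partitions):
        # Each partition holds a window of {"offset", "timestamp", "lag"} samples;
        # keep the offset from the sample with the newest timestamp.
        newest = max(
            (o for o in partition["offsets"] if o),
            key=lambda o: o["timestamp"],
            default=None,
        )
        if newest is not None:
            latest_offsets[(topic, index)] = newest["offset"]

print(latest_offsets)  # e.g. {("Maker", 0): 247568}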

