How to detect slow consumers when streaming real-time events

Hi folks! In the context of updating a “real-time” UI, what is the “best” way to dynamically apply backpressure for slow consumers?
Listening to heartbeats doesn’t seem to work for me.
I would be happy to intercept client socket writes.

So far my team has successfully used Vaadin for CRUD-style app development. Using 24.x, I would like to start streaming “real-time” events at scale.

Slow consumers on low-bandwidth connections can easily start backlogging. To partially mitigate this, I am manually pushing every ~100 ms while conflating updates. One low-level approach would be to detect when socket writes block. However, I am unsure how to navigate Vaadin’s abstraction layers to intercept the client socket. Can anyone share advice on this use case?
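For context, my conflation loop looks roughly like the sketch below. It assumes @Push(PushMode.MANUAL) is configured; ConflatingPusher, its fields and the Span target are illustrative names for this post, not part of the Vaadin API.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

import com.vaadin.flow.component.UI;
import com.vaadin.flow.component.html.Span;

// Conflate rapid-fire events and flush them to the client at a fixed interval.
public class ConflatingPusher {

    // Only the newest value survives between flushes; older updates are conflated away.
    private final AtomicReference<String> latest = new AtomicReference<>();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start(UI ui, Span target) {
        scheduler.scheduleAtFixedRate(() -> {
            String value = latest.getAndSet(null);
            if (value == null || !ui.isAttached()) {
                return; // nothing new to send, or the UI is gone
            }
            ui.access(() -> {
                target.setText(value);
                ui.push(); // explicit push; this is where backpressure would have to hook in
            });
        }, 0, 100, TimeUnit.MILLISECONDS);
    }

    // Called from the event source thread, any number of times per interval.
    public void onEvent(String value) {
        latest.set(value);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```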

I don’t have any big insights to offer, other than that pushing every 100 milliseconds sounds like way too often.

JFYI: with manual push mode, after applying rate limiting (a session-wide token bucket plus debouncing of explicit push() calls), it is possible to stream updates with minimal backlogging regardless of network bandwidth, even at just 8 kbit/s.
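The rate limiting is roughly along these lines; a minimal sketch, where the class name, capacity and refill amounts are arbitrary example choices rather than anything Vaadin provides:

```java
import java.util.concurrent.atomic.AtomicLong;

import com.vaadin.flow.component.UI;

// Session-wide token bucket that gates explicit push() calls.
public class PushTokenBucket {

    private final long capacity;
    private final AtomicLong tokens;

    public PushTokenBucket(long capacity) {
        this.capacity = capacity;
        this.tokens = new AtomicLong(capacity);
    }

    // Called when the client acknowledges a round trip (see the ping/pong below).
    public void refill(long amount) {
        tokens.updateAndGet(t -> Math.min(capacity, t + amount));
    }

    // Consumes one token if available; returns false once the budget is spent.
    public boolean tryAcquire() {
        return tokens.getAndUpdate(t -> t > 0 ? t - 1 : t) > 0;
    }

    // Gating a conflated flush: if no token is available the push is simply
    // skipped, and the next flush carries the latest state anyway.
    public void flushIfAllowed(UI ui, Runnable applyLatestState) {
        ui.access(() -> {
            applyLatestState.run();
            if (tryAcquire()) {
                ui.push(); // requires @Push(PushMode.MANUAL)
            }
        });
    }
}
```

The debouncing falls out of the conflation loop above: no matter how many events arrive, at most one explicit push() goes out per interval, and only when a token is available.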

I did observe one anomaly: the heartbeat listener only seems to receive callbacks when there are two or more active Vaadin sessions. I therefore had to implement my own ping/pong, using executeJs, to replenish the token bucket and effectively detect whether the WebSocket is backlogging. It would be great to remove this cruft.
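The ping/pong itself is nothing fancy; a sketch of the idea, where PushTokenBucket is from the sketch above and the 500 ms threshold is just an example value:

```java
import com.vaadin.flow.component.UI;

// Application-level ping/pong: executeJs resolves its PendingJavaScriptResult
// only after the browser has processed the invocation, so the round-trip time
// is a rough proxy for how backed up the push channel is.
public class PingPongProbe {

    public void ping(UI ui, PushTokenBucket bucket) {
        long sentAt = System.currentTimeMillis();
        ui.access(() -> {
            ui.getPage()
                    .executeJs("return true;") // trivial expression; only the round trip matters
                    .then(Boolean.class, ok -> {
                        long rtt = System.currentTimeMillis() - sentAt;
                        if (rtt < 500) {
                            bucket.refill(1); // connection is keeping up, allow another push
                        }
                        // A growing rtt (or no callback at all) suggests the WebSocket
                        // is backlogging; stop replenishing until it recovers.
                    });
            ui.push(); // the ping itself needs an explicit push in manual mode
        });
    }
}
```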

When pushing, is it possible to limit push granularity to only certain component trees / StateNode graphs?

Not that I’m aware of.