Chat bots: ensuring serial processing of messages on a per-conversation basis in a clustered environment
In the context of writing a messenger chat bot in a cloud environment, I'm facing concurrency issues.
Specifically, I need to ensure that incoming messages belonging to the same conversation are processed one after the other.
As a constraint, I'm processing messages with workers in a cloud environment (i.e. a worker pool of variable size, with worker instances that are potentially short-lived and may crash). Also, low latency is important.
So, abstracting a little, the requirements are:
- I have a stream of incoming messages
- each of these messages has a 'topic key' (the conversation id)
- the set of topics is not known ahead of time and is virtually infinite
- I want to ensure that messages of the same topic are processed serially
- on a cluster of potentially ephemeral workers
- if possible, with reliability guarantees, e.g. making sure each message is processed once.
My questions are:
- Is there a name for this concurrency scenario?
- Are there technologies (message brokers, coordination services, etc.) that implement this out of the box?
- If not, what algorithms can I use to implement it on top of lower-level concurrency tools (distributed locks, actors, queues, etc.)?
I don't know of a widely accepted name for this scenario, but a common strategy for solving this type of problem is to route messages so that all messages with the same topic key end up at the same destination. A couple of technologies that do this for you:
- With Apache ActiveMQ, HornetQ, or Apache ActiveMQ Artemis, use the topic key as the JMSXGroupID to ensure that messages with the same topic key are processed in order by the same consumer, with failover
- With Apache Kafka, use the topic key as the partition key, which will ensure that messages with the same topic key are processed in order by the same consumer (see the sketch after this list)
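
Below is a minimal sketch of both variants in Java. The queue/topic name "chat-messages", the broker addresses, and the class names are illustrative assumptions, not part of any particular bot framework; the essential points are keying the Kafka record by the conversation id and setting the JMSXGroupID property on the JMS message.

```java
import java.util.Properties;

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.artemis.jms.client.ActiveMQConnectionFactory;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ConversationKeyedSender {

    // Kafka variant: the conversation id is used as the record key, so Kafka
    // hashes it to a partition and every message of that conversation lands
    // on the same partition, consumed in order by one consumer of the group.
    static void sendViaKafka(String conversationId, String body) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("chat-messages", conversationId, body));
        }
    }

    // JMS variant: the conversation id is set as JMSXGroupID, so the broker
    // pins all messages of that group to a single consumer and re-routes the
    // group to another consumer if that one disappears.
    static void sendViaJms(String conversationId, String body) throws Exception {
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616"); // assumed broker address
        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(session.createQueue("chat-messages"));
            TextMessage message = session.createTextMessage(body);
            message.setStringProperty("JMSXGroupID", conversationId);
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}
```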
Some message broker vendors refer to this requirement as message grouping, sticky sessions, or sticky message load balancing.
Another common strategy on messaging systems with weaker delivery/ordering guarantees (like Amazon SQS) is to include a sequence number in each message and leave it to the destination to resequence and to request redelivery of missing messages when needed.
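
If you go that route, the receiving side needs a resequencer. Here is a minimal sketch, assuming each message carries its conversation id and a sender-assigned, gap-free sequence number starting at 0; the Message record and the handler callback are hypothetical placeholders for whatever your worker actually does.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.Consumer;

public class ConversationResequencer {

    record Message(String conversationId, long sequenceNumber, String body) {}

    // Next sequence number expected per conversation.
    private final Map<String, Long> nextExpected = new HashMap<>();
    // Out-of-order messages buffered per conversation, keyed by sequence number.
    private final Map<String, TreeMap<Long, Message>> pending = new HashMap<>();
    private final Consumer<Message> handler;

    ConversationResequencer(Consumer<Message> handler) {
        this.handler = handler;
    }

    // Called for every delivery; duplicate and out-of-order arrivals are tolerated.
    synchronized void onMessage(Message message) {
        String id = message.conversationId();
        long expected = nextExpected.getOrDefault(id, 0L);
        if (message.sequenceNumber() < expected) {
            return; // duplicate of an already-processed message, drop it
        }
        TreeMap<Long, Message> buffer = pending.computeIfAbsent(id, k -> new TreeMap<>());
        buffer.put(message.sequenceNumber(), message);
        // Drain the buffer for as long as the next expected message is available.
        while (buffer.containsKey(expected)) {
            handler.accept(buffer.remove(expected));
            expected++;
        }
        nextExpected.put(id, expected);
        // A real implementation would also detect gaps that persist too long
        // and request redelivery of the missing sequence numbers.
    }
}
```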