Messaging Patterns for Event-Driven Microservices https://content.pivotal.io/blog/messaging-patterns-for-event-driven-microservices
Some microservices integration scenarios can be solved simply by handling lifecycle events from data persisted in a data store. In this scenario, one or more microservices subscribe to data change events directly from a NoSQL store and are notified whenever data changes: new data is persisted, or existing data is modified or deleted. Unlike the patterns mentioned previously, events are triggered by data operations and the message payload is the updated data itself. This considerably simplifies event-driven models where system operations should follow data updates.
Sensing and responding to data change events from the NoSQL store itself.
By subscribing to data lifecycle events, microservices acting as clients of a Pivotal GemFire or Apache Geode cluster have their listeners triggered by changes in the persisted data. Delivering data events directly from where they are stored is used extensively in capital markets, where extremely low latency is a must. It also powers use cases such as the China Railways ticketing system, where ticketing events can trigger actions in other distributed components, such as fleet logistics adjustments and the repricing of remaining seats.
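As a rough sketch of this pattern, the following Java client uses the Apache Geode client API to register a cache listener, so that create, update, and delete events on persisted entries invoke local callbacks. The `tickets` region name, locator address, and the actions inside the callbacks are hypothetical, not taken from the article:

```java
import org.apache.geode.cache.EntryEvent;
import org.apache.geode.cache.Region;
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.geode.cache.util.CacheListenerAdapter;

public class TicketEventsClient {

    public static void main(String[] args) {
        // Connect as a Geode/GemFire client with server-to-client subscriptions enabled.
        ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)   // hypothetical locator address
                .setPoolSubscriptionEnabled(true)
                .create();

        // Local view of the server-side "tickets" region (hypothetical region name).
        Region<String, String> tickets = cache
                .<String, String>createClientRegionFactory(ClientRegionShortcut.CACHING_PROXY)
                .addCacheListener(new CacheListenerAdapter<String, String>() {
                    @Override
                    public void afterCreate(EntryEvent<String, String> event) {
                        // New data persisted: e.g., kick off a fleet logistics adjustment.
                        System.out.println("created " + event.getKey() + " -> " + event.getNewValue());
                    }

                    @Override
                    public void afterUpdate(EntryEvent<String, String> event) {
                        // Existing data modified: e.g., reprice remaining seats.
                        System.out.println("updated " + event.getKey() + " -> " + event.getNewValue());
                    }

                    @Override
                    public void afterDestroy(EntryEvent<String, String> event) {
                        // Data deleted: clean up or compensate downstream.
                        System.out.println("deleted " + event.getKey());
                    }
                })
                .create("tickets");

        // Ask the servers to push events for all keys in this region to this client.
        tickets.registerInterest("ALL_KEYS");
    }
}
```

Note that the payload delivered to each callback is the updated entry itself, which is what makes this pattern attractive when downstream actions should simply follow data changes.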
While this pattern can be useful, it requires all components to agree on the context and format of the data being exchanged. Architects should be careful not to introduce unwanted coupling between microservices that exchange data events, protecting their boundaries and clearly dividing responsibilities over a bounded context.
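One lightweight way to keep that coupling in check is to exchange a small, explicitly versioned event contract rather than full domain objects. The sketch below illustrates the idea; the class and field names are hypothetical and not part of the original article:

```java
// A deliberately small, versioned contract for the data event shared between services.
// Only fields both bounded contexts have agreed on cross the boundary;
// each service keeps its internal domain model private.
public final class SeatInventoryChanged {
    public static final int SCHEMA_VERSION = 1;   // bump on breaking changes

    private final String trainId;        // hypothetical identifiers
    private final String departureDate;
    private final int seatsRemaining;

    public SeatInventoryChanged(String trainId, String departureDate, int seatsRemaining) {
        this.trainId = trainId;
        this.departureDate = departureDate;
        this.seatsRemaining = seatsRemaining;
    }

    public String getTrainId() { return trainId; }
    public String getDepartureDate() { return departureDate; }
    public int getSeatsRemaining() { return seatsRemaining; }
}
```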
Like RabbitMQ, GemFire also has a Pivotal Cloud Foundry Service Broker and tile for a fully automated operational experience on multiple clouds.
This is not intended to be an exhaustive catalog of asynchronous integration patterns for microservices, but rather a look at common scenarios for cloud-native architectures. There's no single solution for all use cases, and embracing decentralized messaging allows more flexibility, faster iterations and better resiliency.
Decentralization.
As with polyglot persistence, enterprises should define their internal standards for decentralized, polyglot messaging based on reference architecture goals and requirements. Each new product adoption comes with its own costs and challenges, and automating operations on multiple clouds becomes mandatory. Companies should standardize on a few common patterns, implemented using reusable, best-of-breed solutions on top of a cloud-native platform.