
I'm really interested in using Redis Streams as the messaging backbone for an event-driven platform, in place of Kafka, for the following reasons:

  • Lower cost
  • Drastically lower complexity (and virtually zero deployment and maintenance overheads vs managing a Kafka cluster)
  • Increased performance
  • One fewer technology, plus reusing a tech that I'll already be leveraging for caching etc.

However, I'm unsure how I would manage schema evolution between the services without the equivalent of the Schema Registry that I'm used to using with Confluent's hosted Kafka solution.

I can't find much when searching for schema evolution with Redis Streams. Any help would be greatly appreciated.

Thanks in advance.

1 Answer


Coming from a SQL background, I always relied heavily on modeling and schema-evolution tools. But since moving to a mixed approach, in which any given data element might live in Postgres, Mongo, etcd, Kafka, or Redis, and for which I might need any number of different fragments of boilerplate code, I've embraced a Jupyter Notebook template I developed in Python to manage schemas universally.

I use a Python dict to define all the elements of the model I need, e.g.:

someModelDict["someFieldName"] = {"Name":"Node hash name",
                                        "Description":"Host Node hash name",
                                        "Datatype":str,
                                        "ConstraintType":"FK",
                                        "Constraint":"Nodes.NodeHashName",
                                        "SampleValue":"DSFAGER3455",
                                        "Notes":"Node providing resources to the service - hash of IP Address or Hostname"}

I've written relatively simple Python code to generate sample values and queries in any of the platforms I'm using, as well as boilerplate code for web frameworks in which I need to work with the models - all based on that simple Python dict, and some additional mapping dicts (that map datatypes, commands, etc. from Python to the target framework).
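For instance, a generator for one target platform might look like the sketch below. The pythonToPostgres mapping, the generate_postgres_ddl function, and the table name are hypothetical names for illustration, not code from my actual notebook:

# One mapping dict per target platform translates Python datatypes
# to that platform's type names; the generator walks the model dict.
pythonToPostgres = {str: "TEXT", int: "BIGINT", float: "DOUBLE PRECISION", bool: "BOOLEAN"}

def generate_postgres_ddl(table_name, model_dict):
    """Emit a CREATE TABLE statement from a model dict like the one above."""
    columns = []
    for field_name, meta in model_dict.items():
        column = f"{field_name} {pythonToPostgres[meta['Datatype']]}"
        # Translate the FK constraint metadata into a REFERENCES clause.
        if meta.get("ConstraintType") == "FK":
            ref_table, ref_column = meta["Constraint"].split(".")
            column += f" REFERENCES {ref_table} ({ref_column})"
        columns.append(column)
    return f"CREATE TABLE {table_name} (\n  " + ",\n  ".join(columns) + "\n);"

print(generate_postgres_ddl("Services", someModelDict))
# CREATE TABLE Services (
#   someFieldName TEXT REFERENCES Nodes (NodeHashName)
# );

Writing one such generator per target (Redis key layout, Mongo validator, web-framework model class) keeps the dict as the single source of truth.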

If you're more of a JavaScript person, then perhaps a simplistic Node.js front end over JSON files could serve the same function as the Python dicts. Again, by making the originating entity for the model simple (JSON or a dict), you make it straightforward to process it with code to do things like check integrity, audit structure, or migrate from one version to another; a sketch of such a check follows below.
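As a rough illustration of that kind of processing, here is a minimal integrity check in Python. The validate_record helper and the example record are assumptions for illustration, not part of the original notebook:

def validate_record(record, model_dict):
    """Check a record (e.g. one decoded from a Redis Stream entry)
    against the model dict: every field known, every value the right type."""
    errors = []
    for field_name, value in record.items():
        meta = model_dict.get(field_name)
        if meta is None:
            errors.append(f"unknown field: {field_name}")
        elif not isinstance(value, meta["Datatype"]):
            errors.append(f"{field_name}: expected {meta['Datatype'].__name__}, "
                          f"got {type(value).__name__}")
    return errors

# Example: an event payload with a wrong type is flagged.
print(validate_record({"someFieldName": 12345}, someModelDict))
# ['someFieldName: expected str, got int']

A version-to-version migration would follow the same pattern: diff two model dicts, then emit the rename/convert steps for each target platform.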

Having the data model originate from one authoritative place is key, but when using multiple platforms, many of them NoSQL, I've found it best to 'roll your own', based on a simple but extensible structure like a dict or JSON.
