Sometimes speakers frustrate me when they talk about software architecture and forget about the application lifecycle. It happened yesterday at a local meet-up.
Gennady was talking about the possible duplication of security functions. His topic was the development of secure software, and he presented a picture like this:
Telling the story of avoiding duplication between two levels, the application server and the RDBMS, he said it was an achievement of the professional database development team to implement security at the DB level instead of at the application-server level.
Because the database is the central…
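To make the duplication he meant concrete, here is a minimal, purely hypothetical sketch: the same "users may only see their own orders" rule expressed once in application code and once as a database-level policy (PostgreSQL-style row-level security, shown only as a string). The names `visible_orders`, `orders`, and `orders_owner_only` are my own illustration, not anything from the talk.

```python
# Hypothetical illustration of the duplication in question:
# the same authorization rule lives in two places.

# 1) Application-server level: a filter in code.
def visible_orders(orders, current_user):
    """Return only the orders the given user may see (app-level check)."""
    return [o for o in orders if o["owner"] == current_user]

# 2) Database level: the equivalent rule as a row-level security
#    policy (PostgreSQL syntax), kept here as a string for comparison.
DB_LEVEL_POLICY = """
CREATE POLICY orders_owner_only ON orders
    USING (owner = current_user);
"""

orders = [{"id": 1, "owner": "alice"}, {"id": 2, "owner": "bob"}]
print(visible_orders(orders, "alice"))  # only alice's orders
```

Gennady's point was that you could drop the first check and keep only the second; my point below is that this decision outlives the architecture it was made for.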
I’m very sceptical about the long-term success of ideas like that.
I saw a similar case at my last job. There was a legacy client-server system plus a big webshop, both connected to the same database. And there were problems with scaling out the database (even replication didn’t make life easier), because tons of business logic ran inside it, the result of years of development by a professional database development team.
What did they really try to achieve?
When a new type of workload was introduced, they leaned too heavily on existing skills and extended the life of the previous application architecture. But not a happier life.
The right question
The relevant question is: how can they be sure that the old architecture is adequate for the new type of workload?
Very probably, because the research was targeted at avoiding duplication of security functions, the decision makers focused on that, and other limitations of the architecture went undiscovered.
By staying with the previous architecture under a new type of workload, in most cases you end up with a monster application with a super-long lifetime and an extremely high cost of replacement in the future. Why do this just because of some elaborate research into security integration?
Why not cut technical debt by embracing microservices, as others do? No one says you need to change the system all at once, and some duplication of things is OK.