On Integration: The vision of a single database
Before Web Services, there was CORBA. Before CORBA, there was DCOM. Before DCOM, there was RPC. Before RPC, there was BSD sockets. Before sockets, there were databases. And as it was in the beginning, so shall it too be in the end.
The only systematically successful integration strategy in the history of computing is databases. I have discovered more and more lately that integration using a database is well-defined (DDLs: a WSDL that works!), flexible (views and triggers can hide many old sins), well-supported (today, powerful Object-Relational Mapping tools should be de rigueur for any sensible project), and performant (sooner or later, you’re gonna hit the database anyway). Using modern database features, it can be made secure and scalable as well. In the end, databases are the best thing since, well, since databases.
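To make the “views can hide many old sins” point concrete, here is a minimal sketch using an in-memory SQLite database. The table and column names (`cust_tbl`, `stat_cd`, and so on) are hypothetical; the idea is simply that the view, not the legacy table, becomes the integration contract other applications code against.

```python
import sqlite3

# Hypothetical legacy table with an awkward schema, hidden behind a view
# so that consuming applications see a clean, stable interface.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Legacy schema: name split across two columns, status as a cryptic code.
    CREATE TABLE cust_tbl (
        cust_id  INTEGER PRIMARY KEY,
        fname    TEXT,
        lname    TEXT,
        stat_cd  CHAR(1)   -- 'A' = active, 'I' = inactive
    );
    INSERT INTO cust_tbl VALUES (1, 'Ada', 'Lovelace', 'A');
    INSERT INTO cust_tbl VALUES (2, 'Charles', 'Babbage', 'I');

    -- The view is the published contract: it hides the old sins.
    CREATE VIEW customer AS
    SELECT cust_id AS id,
           fname || ' ' || lname AS full_name,
           CASE stat_cd WHEN 'A' THEN 'active' ELSE 'inactive' END AS status
    FROM cust_tbl;
""")

rows = conn.execute(
    "SELECT id, full_name, status FROM customer ORDER BY id"
).fetchall()
print(rows)
# [(1, 'Ada Lovelace', 'active'), (2, 'Charles Babbage', 'inactive')]
```

If the legacy table is later restructured, only the view definition needs to change; every integrating application keeps querying `customer` unchanged.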
I want to write a series of blog posts detailing strategies I use and explore to make database integration work. For now, let me just share my vision with you: One huge enterprise database that appears flat to any application that uses it. All applications in the enterprise using the single database instance. This vision has many practical issues in terms of performance, security, maintainability and understandability. I will spend the blog posts exploring these issues.
First, though: Where is integration using databases applicable, and how does it relate to the main buzzword of the day, Service Oriented Architecture (SOA)?
Database-based integration is only applicable for applications that are not distributed across multiple organizations or distributed widely within the same organization. This is what we can call “application-to-application” (A2A), as opposed to “business-to-business” (B2B). For B2B, technologies associated with SOA are still going to be your best bet. Also, I would not use database integration from desktop clients (“2-tier architecture”). I am not sure whether this is just because everyone has been so excited about the 3-tier architecture for so long. Maybe you could make it work. However, I don’t much care for desktop clients, so I will leave this subject to someone else.
However, when the “services” are just internal services between different parts of my application portfolio, I think integration on the database layer is greatly underused.
Comments:
chwlund - Oct 8, 2006
I guess this is maybe coming in the next articles, but can you explain a bit more how this should work? the way I understand your strategy is that you want all your web servers / application servers, running different kinds of applications and hosting all sorts of domain objects, to use the same database to communicate? then what about long-running transactions? processes that involve humans?
and isn’t this like taking application design back to the 80s? is it possible to come up with a unified database design that can meet the requirements of the different applications? I get this utopian ERP-like feeling here? isn’t it just so that a large/medium organization will always have some number of different applications running on different technologies, with ownership spread across the different departments?
in my opinion I think integration higher up in the application stack is a better idea, for instance in an integration layer or service layer!
Johannes Brodwall - Oct 8, 2006
Thank you for the comment.
I am not sure I understand your concern about long-running transactions. These issues are the same no matter what technology you use to integrate, I think? Compensating transactions are not tied to web services or remoting in any particular way.
Anyway, yes, in many ways, this is a “back to the 80s” vision. I do think that many of the ideas that came out of computing in the 90s were steps in the wrong direction, and we have to rethink them. The monolithic system design of the 80s does present significant challenges of scale, as you point out. These challenges are what I intend to address.
A2A integration with a (remote) service layer has in my experience caused much more harm than good in terms of productivity, performance, reliability and complexity. This is what I intend to explore in my next post.
Johannes Brodwall - Oct 19, 2006
I have actually never seen a system implemented with a BPM-tool, my knowledge of such tools is limited to articles on how it might work, so I can’t really compare the techniques well.
A BPM using database integration would indeed be a strange monster. You can’t drive the control flow from the database (without going insane, at least), so the control would have to be decentralized. I don’t know how current tools deal with this.
Anyway, it would be really interesting to hear about your experiences with BPM in a real project (I subscribe to your blog now). I suspect the hard parts aren’t what I’d intuitively think they’d be.
chwlund - Oct 19, 2006
after thinking it through I agree that those issues are the same independent of technology. sorry about that! but I am one of those sick bastards that likes the idea of implementing long-running business processes in a separate process layer, using some kind of BPM-tool or sometimes an integration tool and then I would rather like to integrate closer to that layer than doing this all the way down in the database layer…
chwlund - Oct 19, 2006
I am really looking forward to experiencing a project with a BPM-tool myself, but I am quite new in the business so I haven’t got the chance yet. I have tried a few in my spare time, though, and some of them seem very promising. BizTalk is very powerful and flexible, and is great for integration processes, messaging, and more complex human-driven business processes. a great, powerful open source Java BPM tool is http://www.intalio.com/. I will post more comments on my experiences with these and others on my blog, so you are welcome to subscribe! in general the BPM market in Norway is starting to grow; more companies want to have control over their processes, so I think the process-oriented way of thinking about certain applications is coming more and more! later!