Posted on August 8th, 2013
My opinion is that it’s good to reexamine your approach early and often. I currently work on a service platform that is part of a larger service-oriented approach. We have one source of data, and want to push that to many different clients.
I didn’t design the architecture or database schema, but they’re fairly straightforward. We have a pretty flat object model. There are some associations that could be modeled explicitly, but when dealing with Widgets and SubWidgets we expose most of those relationships through SQL views.
Now, switching to an ORM like NHibernate kind of presupposes that you have a certain amount of hierarchy in your data model. Things belong to other things and whatnot. Unfortunately, our data model isn’t really arranged like that. But honestly that’s not a big deal. No sacred cows here. If going to an ORM is really the right option, then refactoring and rebuilding our data model should be on the table. Since we’ve done a good job of keeping concerns separated when it comes to database access, it wouldn’t be that complicated, since we wouldn’t have to change much at the service layer. We would just be changing what happens at the layers below it.
That’s all well and good, but there are some things I like about what we have right now. First of all, it’s fast. Really fast. Yeah, I know, you have to be careful about premature optimization, but in this case, where we need to supply service calls to iOS, Android, Web, and another platform that has strict demands about service request and page load times, getting things out of the database should not be a bottleneck. On top of that, since we have a flat hierarchy with SQL views exposing related data in a single dataset, we know that at any given entry point, we are getting the right data and only the right data.
With most ORMs, and NHibernate is no exception, there is always the issue of lazy loading associated data sets. There is a certain amount of overhead in fine-tuning the data loading so that you don’t accidentally pull down an entire table’s worth of data when all you wanted were a few associated rows. The converse can also happen: you want two or three layers of associated records, but lazy loading kicks in and you don’t get them fast enough.
NHibernate (unlike some PHP ORMs) actually seems to handle all of this really well, with a lot of intuitive configuration and conventions to help make things performant when you need it. But it’s all still overhead. Right now, when I set up a new data contract, I can set it and forget it. With an ORM, my experience has been that there’s always a game of fine-tuning things; otherwise, your code ends up generating ten or a hundred times more SQL queries than are really needed (the classic N+1 problem).
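To illustrate the kind of tuning I mean, here’s a rough sketch — the Widget and SubWidget entities are just stand-ins for our real data contracts, not actual code from our platform — of how NHibernate’s LINQ provider lets you fold child rows into the initial query instead of loading them lazily:

```csharp
using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

// Hypothetical entities standing in for our real data contracts.
public class SubWidget
{
    public virtual int Id { get; set; }
}

public class Widget
{
    public virtual int Id { get; set; }
    public virtual IList<SubWidget> SubWidgets { get; set; }
}

public static class WidgetQueries
{
    // Left to default lazy loading, iterating the widgets and touching
    // each one's SubWidgets issues one extra query per widget.
    // Fetch() tells NHibernate to join the children into the first query.
    public static IList<Widget> LoadWidgetsWithChildren(ISession session)
    {
        return session.Query<Widget>()
                      .Fetch(w => w.SubWidgets)
                      .ToList();
    }
}
```

This is exactly the per-query tweaking I’m talking about: harmless in one place, tedious when every data contract needs its own round of it.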
One of the major things I’m looking for with this, besides performance, is maintainability. Our current system has a sort of home-grown migration process. It works for now, but requires a fairly high level of SQL knowledge to implement. That puts my platform developers in the position of almost having to be a DBA to make a change to a data contract. This is certainly not an ideal situation. Fluent NHibernate would help with that immensely. We could store our mappings as C# classes and let version control handle the revisioning. Rolling back a database change becomes somewhat problematic unless we treat any rollback as just another change set. And unfortunately, mappings-as-code don’t help with the process of migrating the data itself.
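For concreteness, a mapping stored as code might look something like this. It’s a sketch, assuming a hypothetical Widget entity with Id, Name, and SubWidgets properties; the point is that a schema change becomes an ordinary diff in version control rather than hand-written SQL:

```csharp
using FluentNHibernate.Mapping;

// Fluent NHibernate mapping-as-code. Because it's just a C# class,
// changes to it are reviewed and revisioned like any other source file.
public class WidgetMap : ClassMap<Widget>
{
    public WidgetMap()
    {
        Table("Widgets");
        Id(x => x.Id);
        Map(x => x.Name);
        HasMany(x => x.SubWidgets)
            .KeyColumn("WidgetId")   // FK column on the child table
            .Cascade.All();
    }
}
```

A developer adding a column edits the entity and the map, and the diff tells the whole story — which is precisely what our home-grown SQL migrations don’t give us.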
For these reasons, I’ll probably sit on what we have until I find something that not only abstracts the migration process, but also doesn’t require a lot of overhead in terms of performance tuning and maintenance.