
A brief history of pointless mappings

Posted at 10:00 on 04 March 2019

Throughout my career, I've worked on many projects, in .NET as well as with other platforms and frameworks. One particular practice that I've encountered time and time and time again in .NET, which I rarely see elsewhere, is that of having a separate identical set of models for each layer of your project, mapped one to another by rote with AutoMapper.
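
For readers who haven't run into it, here is a minimal sketch of what I mean. The CustomerEntity and CustomerDto names are made up purely for illustration; the point is that the two classes are property-for-property identical, with AutoMapper copying values from one to the other.

// Data access layer model, e.g. an Entity Framework entity.
public class CustomerEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// Business layer model: an exact copy of the same shape.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// AutoMapper profile that copies one to the other, property by property.
public class CustomerMappingProfile : AutoMapper.Profile
{
    public CustomerMappingProfile()
    {
        CreateMap<CustomerEntity, CustomerDto>();
        CreateMap<CustomerDto, CustomerEntity>();
    }
}

Multiply that by every table in your database and every layer in your architecture, and you get a sense of just how much code this generates that says precisely nothing.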

It's a practice that I detest with a passion. It adds clutter and repetition to your codebase without delivering any benefit whatsoever, and gets in the way of important things such as performance optimisation. In fact, if you suggested it to a Python developer or a Ruby developer, they would probably look at you as if you were crazy. But many .NET developers consider it almost sacred, justifying it on the grounds that "you might want to swap out Entity Framework for something else some day."

But why should this be? How did speculative generality end up being viewed in the .NET ecosystem as a Best Practice™? In actual fact, there are historical reasons that, in the dim and distant past, were very real concerns.

Back in the early days of .NET, round about 2001/2002, the best practice that Microsoft recommended was to use stored procedures for everything. It didn't take long for everyone to start complaining on the ASP.NET forums about how cumbersome this was. Half of the .NET community had come from Borland Delphi, with its RAD tools letting you drag and drop data sources and data grids onto a form, while the other half had come from Java, which already had O/R mappers such as Hibernate. To go from either of these approaches to hand-cranking stored procedures, with all the tedious repetition that it involved, was like going back into the stone age.

Naturally, a whole lot of two-guys-in-a-garage ISVs were more than willing to step into the gap with a slew of ORMs. By 2004, we had Entity Broker, Pragmatier, WilsonORMapper, Objectz.net, Sisyphus, NPersist and a host of others that have long since been forgotten. They were coming and going like nobody's business, and you couldn't rely on the one you chose still being around six months later. With this being the case, abstracting out your ORM "just in case" you needed to swap it out for something else seemed like an eminently sensible -- if not vitally necessary -- suggestion.

Within a couple of years, things started to settle down, and two market leaders -- the open-source NHibernate and the commercial LLBLGen Pro -- emerged. These both quickly gained a solid backing, and they are both still going strong today.

But there was nothing from Microsoft. In the early days they promised us an offering called ObjectSpaces, but it turned out to be vapourware and was eventually abandoned without ever shipping.

This was a problem for some people. Right from the beginning, the majority of .NET developers have worked in companies and teams that, given the choice, wouldn't touch anything with a barge pole unless it came from Microsoft. But working with DataSets and stored procedures was so painful that they held their noses and used NHibernate anyway, wrapping it in an abstraction layer in the hope that they could swap it out for Entity Framework the moment the latter became stable enough to do so.

Entity Framework finally appeared in 2008, but the first version was so bad that many in the .NET community started up a vote of no confidence in it. It was 2011 -- ten years after .NET 1.0 was first released to beta -- before Entity Framework was good enough to see serious use in production, and a further two years before it reached a similar level of functionality to NHibernate.

Nowadays, of course, Entity Framework is well established and mature, and although there are differences between EF6 and EF Core, the only thing these days that you're likely to want to swap it for is hand-crafted SQL for performance reasons -- and that usually means cutting right across your neat separation between your DAL and business layers altogether. Even testing is scarcely a reason any more now that EF Core has an in-memory provider for the purpose.
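
For what it's worth, here is a rough sketch of the kind of test I mean, using a made-up ShopContext and Customer entity. The only EF Core specific pieces are the Microsoft.EntityFrameworkCore.InMemory package and the UseInMemoryDatabase call; no hand-rolled repository abstraction is needed to keep the database out of your tests.

using System.Linq;
using Microsoft.EntityFrameworkCore;
using Xunit;

// Hypothetical entity and context, purely for illustration.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public ShopContext(DbContextOptions<ShopContext> options) : base(options) { }
    public DbSet<Customer> Customers { get; set; }
}

public class CustomerTests
{
    [Fact]
    public void Saved_customers_can_be_read_back()
    {
        // Swap the real database provider for the in-memory one.
        var options = new DbContextOptionsBuilder<ShopContext>()
            .UseInMemoryDatabase(databaseName: "CustomerTests")
            .Options;

        using (var context = new ShopContext(options))
        {
            context.Customers.Add(new Customer { Name = "Alice" });
            context.SaveChanges();
        }

        // A second context with the same options sees the same in-memory store.
        using (var context = new ShopContext(options))
        {
            Assert.Equal("Alice", context.Customers.Single().Name);
        }
    }
}

Whether the in-memory provider is a faithful enough stand-in for a real database is a separate debate, but it does take away the "we need the abstraction for testability" argument.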

But old habits die hard, and by the time we got here the practice of abstracting your O/R mapper on the grounds that "you might want to swap out your data access layer for something else" had become deeply entrenched as a Best Practice. Many of its advocates are too young to remember its historical context, so they aren't aware that it is aimed at a use case whose likelihood has nosedived. Nor are they aware that although we once had a good idea of what we'd have to swap our DAL out for, nowadays all we can talk about are unknown mystery alternatives. But this is why we constantly need to be reviewing our best practices to see whether they still apply. Because if we don't, they just fossilise into cargo cult programming. And that benefits nobody.