
Designing a great API  

Several years ago I worked on a payroll package, developing a core engine that required an API so third parties could write calculations, validations and security gates that would execute as part of its regular operation.

We were a small team and I had many conversations with another developer tasked with building a payroll using the API I would provide. Some methods here, classes there, the odd helper function, and I had an API – and then we had a mini payroll running.

Then he showed me the code he had written and that smug grin dropped off my face. It was awful.

Perhaps this other developer wasn’t as great as I’d thought? Looking at the code, though, made me realise he had done the best anyone could with a terrible API. I’d exposed parts of the core payroll engine as hooks for wherever it needed a decision. Its job was to run the payroll – a very complex task that involved storage, translation, time periods, users and companies. That complexity and context had leaked out.

Unfortunately it’s not a unique story – many APIs are terrible to use. They’re concerned with their own terminology, limitations and quirks because they are exposed sections of an underlying system, developed by those responsible for that underlying system.

If you want others to have a good experience with your product you have to put yourself in their shoes. Whether it’s a UI or an API makes no difference.

You are not the user

That’s the real difference between writing the classes that form your regular implementation and those that make up your public API.

We had time to fix our payroll API. Instead of refining and polishing here and there, we took the 20 or so snippets developed for the mini payroll and pruned, cleaned and polished until they looked beautiful. They scanned well and made sense to payroll developers unfamiliar with our package. When a third developer, familiar with payrolls but unfamiliar with our package, developed the necessary code for a fully-functional jurisdiction in record time with minimal assistance, we knew we had hit our goal.

Sure, implementing that new API was hard work. Instead of simple methods sticking out of the engine we had a facade over it, but it was justified. They were two different systems for two different types of user, with distinct ideas about what the system was and how it was going to be used.

Code First

Many years later I found myself on a small team of three tasked with putting a brand-new API on top of Entity Framework for configuring models with code – the API the .NET world would come to know as Code First. I was determined to use my experience and avoid another complex API surface littered with terminology and leaky abstractions. Parts of EF already suffered from that problem.

So for the first few weeks of that project we didn’t write any of the code that would in fact become Code First.

Instead we decided who our user was – in this case a C# developer who likes writing code, knows LINQ and some database concepts but doesn’t know Entity Framework, because the people who did were already using Model First or Database First.

Then we wrote tiny sample apps and tried to find simpler and simpler ways to describe them in code. We’d often start on a whiteboard with a scenario and write out the complete mapping. We’d then try to find conventions that would remove the need for most of it, and write succinct code to configure the rest. As the newest guy on the team I’d fight to keep EF terms away from the main API surface in order to reduce the barrier to entry and help drive adoption.
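To give a flavour of what those conventions buy you – this is a sketch with illustrative class names, not the shipped convention list – the goal was that a model like this would need no mapping code at all:

```csharp
using System.Collections.Generic;

// Sketch only: Customer/Order are illustrative, but they show the kind of
// model the conventions were designed to map with zero configuration.
public class Customer
{
    public int Id { get; set; }                     // a property named "Id" is taken as the primary key
    public string Name { get; set; }
    public ICollection<Order> Orders { get; set; }  // a collection navigation implies one-to-many
}

public class Order
{
    public int Id { get; set; }
    public Customer Customer { get; set; }          // inverse navigation, paired up by convention
}
```

Explicit configuration then only has to describe whatever the conventions can’t infer.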

Finally we’d hit the computer and develop stub classes and methods to make the samples compile and let us try the IntelliSense. This isn’t always necessary, but if you want to develop a fluent API or provide lots of type safety, such as Code First’s relationship mapping, it’s highly recommended.
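That step can be as cheap as a throwaway stub – the type names below are hypothetical, not the real Code First classes – with just enough shape for the samples to compile and the IntelliSense flow to be judged:

```csharp
using System;
using System.Linq.Expressions;

// Throwaway stubs: enough surface for sample code to compile so the
// IntelliSense experience can be evaluated. Names are illustrative only.
public class ManyNavigationStub<TEntity, TTarget>
{
    public DependentNavigationStub WithRequired(
        Expression<Func<TTarget, TEntity>> navigationProperty)
    {
        throw new NotImplementedException(); // behaviour comes later; the shape is the point
    }
}

public class DependentNavigationStub
{
    public void WillCascadeOnDelete()
    {
        throw new NotImplementedException();
    }
}
```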

We’d then revisit the samples later to see whether they could be read as easily as they were written, figure out what problems people were likely to run into, and decide whether we could solve them without too much noise. Sometimes this meant having more than one way to do things, such as chaining the fluent methods or allowing a bunch of properties to be set (solved with an extension method class providing the fluent API), and deciding how users could manage larger models (solved by subclassing EntityConfiguration&lt;T&gt; – now EntityTypeConfiguration&lt;T&gt;, sigh – and allowing redundant specification for things like relationships that span more than one class).
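The larger-model approach looks something like this – a sketch assuming a Customer/Order model, with EntityTypeConfiguration&lt;T&gt; as it later shipped in EF 4.1:

```csharp
using System.Data.Entity.ModelConfiguration;

// One configuration class per entity keeps a large model manageable;
// Customer and Order here are assumed sample entities.
public class CustomerConfiguration : EntityTypeConfiguration<Customer>
{
    public CustomerConfiguration()
    {
        HasMany(c => c.Orders)
            .WithRequired(o => o.Customer)
            .WillCascadeOnDelete();
    }
}

// Each configuration class is then registered with the model builder, e.g.
// modelBuilder.Configurations.Add(new CustomerConfiguration());
```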

We finally ended up with succinct code like this, with IntelliSense guiding you along the way and preventing you from even being able to specify invalid combinations. HasMany prompts with the collection properties on Customer, and it won’t show you WithRequired unless it is valid. In the case of Required to Required it will ensure that the WithRequired specifies which end is principal and which is dependent. In short, it guides you through the process and results in highly readable code.

Entity<Customer>().HasMany(c => c.Orders).WithRequired(o => o.Customer).WillCascadeOnDelete();

This process took a little longer but, given the amount of use the API would get, that time would be saved by users countless times over.

Code First went down incredibly well with both the target audience and existing EF users, and it inspired the simpler DbContext interface that became the recommended way of accessing EF.

I think it’s one of the nicest APIs to come out of Microsoft and .NET.

[)amien

PS. Martin Fowler has some great guidance in his book Domain Specific Languages.

10 responses  

  1. Nice post. It took me a while to wrap my head around the terminology, but once I did my code read like a sentence. Which is exactly how an API should read.

    Antony Scott – November 29th, 2011
  2. Which terminology did you find difficult? We tried to keep EF’s terms out of the API and go with terms like Many/Required which we felt were simpler and people may have picked up through exposure to other tools such as UML modelling.

    [)amien

    Damien Guard – November 29th, 2011
  3. This fluent API style always struck me as extremely nasty, except if the return value of the methods is actually a fresh, immutable object like LINQ does it. This is such a hack.

    Maybe it is okay because its usage is isolated to a small part of the code (the mapping). But I would really not want to use this style for an api that is used everywhere.

    Such an abusive practice. Am I overreacting?

    tobi – November 29th, 2011
  4. Fluent and lambda-based APIs aren’t always the way to go – your audience needs to be familiar with them, and in our case not knowing LINQ was going to be a major hurdle in using EF. If you also consider that LINQ and lambdas are used to describe how objects are selected and processed declaratively for translation at runtime once the environment is known, then object mapping is a very similar problem. You describe the relationships and characteristics of your objects at compile time, then at run-time the mapping is generated based on that description and is suitable for your target (e.g. MySql/Oracle/SQL).

    If you prefer the traditional command/property pattern you can partially use the API in that way:

    var customer = Entity<Customer>();
    var customerAddress = customer.HasMany(c => c.Orders).WithRequired(o => o.Customer);
    customerAddress.CascadeOnDelete = true;

    The HasMany line is very difficult to express in terms of a command/property. The first argument here ensures it will only take a reference to a Customer object. Through generic type arguments and inference it also creates an object so that WithRequired will know it must get back to the original Customer object or in the case of WithMany a collection of Customer objects. That kind of strong type safety is impossible with properties.

    [)amien

    Damien Guard – November 29th, 2011
  5. One downside of fluent APIs turns up when debugging. Too frequently, a debugger has limited ability to highlight which part of the fluent API expression is due to be executed next, making it hard to get context when problems occur.

    Rob Grainger – November 30th, 2011
  6. You can set the breakpoints on individual sections but yes highlighting is confusing. One solution is to put a newline before each dot, e.g.

    modelBuilder.Entity<Customer>()
        .HasMany(c => c.Orders)
        .WithRequired(o => o.Customer)
        .WillCascadeOnDelete();

    [)amien

    Damien Guard – November 30th, 2011
  7. Good post, perhaps you can turn this into a series with your lessons learned – would be a great learning resource

    Steve – November 30th, 2011
  8. TDD. TDD will automatically put you in the user’s shoes as you will be writing the tests first. Naturally you will write the tests in the most intuitive manner.

    Balbir Chahal – November 30th, 2011
  9. In my experience TDD works for certain types of small granular objects but can be difficult for things like the builder pattern, whereby a lot of inputs are required before anything useful comes out. In the case of ModelBuilder what comes out is a DbContext – again, not something you could easily examine to see if it worked. You could argue that’s because the rest of EF wasn’t designed that way.

    [)amien

    Damien Guard – November 30th, 2011
  10. That is a very cool project to participate in. I am jealous!

    Here is another great book on the subject of creating APIs:

    Framework Design Guidelines – Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Regarding fluent methods – I love them, but a user of my own ORM commented that he preferred overloaded methods (or ones with optional parameters) so that you can more easily see all of your options. He said that with fluent, sometimes the user has to be already aware of the existence of the fluent methods before they can be used. I think this point is even more true when the fluent methods are extension methods, because then the user also has to know to include a using statement to import them. Ideally, the fluent methods are present as methods on a builder object instead of as extensions (not always possible, I know).

    Jordan

    Jordan Marr – November 30th, 2011
