Shipping a product at Microsoft: Kiota

Apr 7, 2023

Introduction

We recently announced the general availability of Kiota, an open-source client generator for REST APIs with an OpenAPI description. I am a co-founder of this product, and the launch coincides with my three-year anniversary at Microsoft, so I thought I would take the time to write about my experience of taking a product from idea to v1 at Microsoft.

Throughout my career, I have been involved with multiple “products”: a second-hand marketplace for industrial hardware, an enterprise social network solution, a localization and translation solution for SharePoint, and an IoT-based temperature management optimization solution.

Timeline

Foundation

Kiota started as a proof of concept with Darrel Miller over the 2020 winter holiday break. We wanted to validate whether it would be possible to build yet another OpenAPI-based client code generator because, after a thorough evaluation of existing solutions, we found that none of them would scale to our needs.

Microsoft Graph is one of the world’s largest public APIs in terms of endpoints, which comes with extra requirements for code generation. Our team is responsible for shipping client experiences (SDKs, tools, etc.), and a few points motivated us to build a new generator:

  • The aging code generator was difficult to maintain and made any innovation costly.
  • Our team was the only one contributing to that old code generator and the ramp-up was steep.
  • There was demand for additional languages and additional “API patterns”.
  • We are still dealing with challenges caused by our earlier adoption of another OpenAPI-based code generator.

We set a couple of ground rules to avoid repeating past mistakes:

  • Rely on industry standards and contribute back to them.
  • The generated code should only rely on a set of abstractions.
  • Separation of concerns between serialization, authentication, and HTTP aspects (a sketch of this idea follows the list).
  • Open source is the only way such a tool is going to be successful.
  • Simple to use. APIs are a complicated space; let us keep the tool simple.
  • Built in a way that supports a diverse set of programming languages.
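
To make the second and third ground rules more concrete, here is a minimal, hypothetical Go sketch of the idea. The names below are illustrative only and are not Kiota’s actual abstractions: the point is that the generated client only sees small interfaces, while concrete serialization, authentication, and HTTP implementations are plugged in separately.

```go
// Hypothetical sketch (not Kiota's actual API): the generated client depends
// only on small interfaces; serialization, authentication, and HTTP concerns
// live behind them and can be swapped independently.
package client

import "context"

// SerializationWriter abstracts how bodies are serialized (JSON, text, etc.).
type SerializationWriter interface {
	WriteStringValue(key, value string) error
	Content() ([]byte, error)
}

// AuthenticationProvider abstracts how outgoing requests are authenticated.
type AuthenticationProvider interface {
	AuthenticateRequest(ctx context.Context, req *RequestInformation) error
}

// RequestAdapter abstracts the HTTP layer; generated code never imports net/http.
type RequestAdapter interface {
	Send(ctx context.Context, req *RequestInformation) ([]byte, error)
}

// RequestInformation is a transport-agnostic description of a request.
type RequestInformation struct {
	Method      string
	URLTemplate string
	Headers     map[string]string
}

// UsersClient stands in for generated code: it only uses the interfaces above.
type UsersClient struct {
	adapter RequestAdapter
}

func NewUsersClient(adapter RequestAdapter) *UsersClient {
	return &UsersClient{adapter: adapter}
}

func (c *UsersClient) Get(ctx context.Context) ([]byte, error) {
	req := &RequestInformation{Method: "GET", URLTemplate: "{+baseurl}/users"}
	return c.adapter.Send(ctx, req)
}
```

Because the generated code never references a specific HTTP client, serializer, or credential type, each of those concerns can evolve, or be replaced per language ecosystem, without regenerating or touching the client code.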

One thing that positively surprised me at Microsoft is that even though it is a large company, it encourages initiatives like this one. While the ship takes time to steer in any direction, there is power below deck. Navigating a large organization can also be complex at times, but after a couple of discussions, our organization committed to Kiota.

Ramping up

While getting the support of user experience/documentation/legal/privacy/security/marketing professionals results in a more finished product, it comes at a coordination cost. Thankfully, we are working with great Program Managers, and they make sure all our ducks are in a row.

We spent time onboarding additional developers, which was a challenge as we were also figuring out fundamental aspects of Kiota. These aspects range from mapping models expressed in OpenAPI descriptions to implementing cross-cutting concerns (retry/redirect handling, authentication…). We developed three languages up front to evaluate our solutions and make sure we were not making core generation engine decisions that favored any one language.
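
To illustrate what one of those cross-cutting concerns can look like, here is a deliberately simplified retry handler written as a standard Go net/http RoundTripper. It is a toy sketch under my own assumptions, not Kiota’s actual middleware, which handles far more (Retry-After headers, exponential backoff, re-sending request bodies, and so on).

```go
// Toy retry middleware sketch: wraps another RoundTripper and retries
// throttled or transiently failing requests a bounded number of times.
// Not Kiota's implementation; real handlers also honor Retry-After,
// back off exponentially, and take care of re-sending request bodies.
package middleware

import (
	"net/http"
	"time"
)

type RetryTransport struct {
	Next       http.RoundTripper // next handler in the chain (e.g. http.DefaultTransport)
	MaxRetries int
	Delay      time.Duration
}

func (t *RetryTransport) RoundTrip(req *http.Request) (*http.Response, error) {
	for attempt := 0; ; attempt++ {
		resp, err := t.Next.RoundTrip(req)
		retryable := err != nil ||
			resp.StatusCode == http.StatusTooManyRequests ||
			resp.StatusCode == http.StatusServiceUnavailable
		if !retryable || attempt >= t.MaxRetries {
			return resp, err
		}
		if resp != nil {
			resp.Body.Close() // discard the failed response before retrying
		}
		time.Sleep(t.Delay)
	}
}
```

Chaining several such handlers (retry, redirect, telemetry…) is a common way to keep each concern isolated while the client code stays oblivious to all of them.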

Innovation

Kiota unlocked rapid innovation for our team. I initially implemented Go generation in only two weeks of Fix Hack Learn (FHL) while learning the language itself. This demonstration of the portability of the engine to additional languages helped unlock resources for PHP and Python. Since then, we have seen significant adoption of the Go SDK preview.

Two interns developed most of the Ruby support. While code generation is complex by nature, this showed the engine was approachable even to the most junior team members.

While Kiota will remain CLI-first, as CLIs empower users through repeatability, most people prefer GUI experiences. Keeping developers in their favorite editor goes a long way for discoverability and productivity. During the last FHL week, I explored building a Kiota Visual Studio Code extension. The demo of that extension triggered conversations, and we shifted our initial plans so we could fast-track the release of a preview.

Open-source contributions

Working on Kiota has allowed me to contribute back to our dependencies. Seeing how communities outside of the Microsoft realm collaborate and grow has been refreshing.

As a recent development, Red Hat has started contributing A LOT to Kiota over the last few months. They are not the only ones, of course, and we have other “big names” evaluating Kiota for their client experience strategy. Having the opportunity to collaborate with such a juggernaut of the open-source world is a learning experience.

Releasing

Releasing is bittersweet.

Releasing comes with the angst about how the work will be received, the worry of getting things right, the pride of having built something, and the hope that people will notice your work. I imagine any creative person experiences similar feelings when revealing their work to the public.

The ever-evolving nature of software development might be the major difference with other creative domains. Before the release, the development team perceives “v1” as the finish line, when it really is just another step in the product lifecycle. Soon you start planning the next minor release and noticing the things you wish you had not missed, which will now require a major release.

While I am a huge advocate for remote work, not having the ability to go celebrate the release with the team in person afterwards was a bit disappointing. Our team members are based in Nairobi, Bristol, Montréal, Redmond, and other places. We will have celebrations to catch up on during the next offsite.

Standardization work

To work on the core Kiota generation engine (the part all languages share), I had to learn about new standards and even got opportunities to contribute back. There is something humbling about improving, even in a minor way, the work of the founders of the internet.

OpenAPI & JSON Schema

Kiota relies on OpenAPI to understand any REST API and generate clients. OpenAPI itself leverages JSON Schema to describe request and response bodies, even though, strictly speaking, JSON Schema is not an IDL. I had the opportunity to gain a lot of experience with both standards and to identify their shortcomings when used for the purpose of generating clients. I first presented a session on this topic at the API Specifications Conference.

Conversations followed the session and led me to help clean up and document widely used formats, and to funnel that feedback to the OpenAPI folks as they work on the next version. As I learn more about these standards and the way those communities operate, I am looking to get more involved over time and contribute more actively.

OData

Microsoft Graph relies on OData as its design “language” to describe APIs, patterns, and conventions. When a product team wants to add a new set of APIs to Microsoft Graph, they do it through CSDL, OData’s definition language. OData comes with conventions, and learning them well enough to understand their implications can be difficult. I cannot recall how often I have been confused about the ContainsTarget attribute over the years… In-depth OData design training resources are scarce. Working at Microsoft has been a big advantage, as I get to work with one of the original designers of OData.

RFC 6570

Kiota uses URI templates to build URIs before making the requests to the service. I had the chance to spot a mistake in the specification and submit an erratum.
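
For readers who have not come across RFC 6570 before, a URI template is a pattern such as https://example.org/users/{id} whose placeholders are expanded with runtime values (with the proper encoding) right before the request is sent. The toy Go function below handles only the simplest form of expansion and is purely illustrative; the spec also defines operators like {+var}, {?var*}, and {#var}, which real implementations, including the ones Kiota relies on, must support.

```go
// Toy expansion of the simplest RFC 6570 form: {var} with percent-encoding.
// Illustrative only; the full spec covers many more operators and rules.
package main

import (
	"fmt"
	"net/url"
	"strings"
)

func expandSimple(template string, vars map[string]string) string {
	result := template
	for name, value := range vars {
		result = strings.ReplaceAll(result, "{"+name+"}", url.PathEscape(value))
	}
	return result
}

func main() {
	uri := expandSimple("https://example.org/users/{id}/messages", map[string]string{"id": "jean dupont"})
	fmt.Println(uri) // https://example.org/users/jean%20dupont/messages
}
```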

Standardization takeaway

While working on Kiota I deepened my understanding of the web through reading additional standards like IEEE 754, RFC 3986, RFC 9110, and more. I believe every developer should read the standards their technical stacks depend on. Yes, it is a large body of literature that can sometimes be hard to read, but not reading it is usually costlier in the long run. School did not teach me much about reading standards (the focus was on practical skills and theoretical background), and it feels like an important way to grow technically.

Conferences and online resources

Conferences

Some of you might know that I used to participate in about 12 to 15 conferences a year. Since joining Microsoft, I have only presented at a handful of events: the API Specifications Conference, Collab Days Belgium, and the Caribbean Developer Conference. I will not pretend I miss the traveling; I do miss the people and the enthusiastic conversations that followed.

I am still struggling to reconcile the amount of time spent traveling with the limited reach of those events compared with online events. My lower back health has vastly improved now that I am not crammed into an airliner seat multiple times a month. I think I will keep my traveling to only a couple of out-of-town, in-person events a year.

User groups

User groups seem to have fully transitioned to a hybrid mode in North America and will not go back to in-person-only events. The shift to remote work explains this change: people who used to come to the office are now working from home, and user groups have, almost out of habit, kept the option to attend online.

This trend broadens the audience, reduces environmental impact, and gives time back to people. I have had the opportunity to spend more time with local communities; it is a fantastic way to “test out” new sessions as a public speaker.

Online events

I have had the chance to present at multiple online events over the last couple of years. I really enjoy the efficiency of these events, as the audience is larger and the recordings are shared afterwards. However, not having audience feedback and not being able to talk to people afterwards is frustrating to me.

Conclusion

Thank you for making it all the way to the end. Do not hesitate to leave comments, and until next time!


Last edited Apr 15, 2024 by Vincent Biret

