Content Developer II at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Get ready for Microsoft Build: Sharpen your skills


Microsoft Build is nearly here! To get ready for the big event, we're exploring resources that will help you get a head start on key topics by building new skills with the latest tools and technologies. This selection of resources delves into cloud-native development, building intelligent apps with AI, working more productively with GitHub Copilot, and more. Plus, we have other timely news and announcements for developers from around Microsoft. 

 

Training & Assessment

Implement security through a pipeline using Azure DevOps
Find out how to secure your Azure pipelines. Develop key skills, such as configuring secure access to pipeline resources, validating permissions, and managing identity for projects. Work through the Microsoft Learn path and then take the assessment to earn your credential.

GitHub Copilot fundamentals: Understand the AI pair programmer
Get started with GitHub Copilot to help enhance your productivity and foster innovation. This Microsoft Learn path will show you how to implement Copilot in your organization and use it in your own projects.

Accelerate Developer Productivity with GitHub and Azure for Developers
Explore GitHub and Azure with this collection of curated resources. Get started with GitHub Copilot, learn how to use GitHub with Visual Studio Code, implement Microsoft DevOps, and more.

Applied Skills: Build an Azure AI Vision solution
Learn how to use pre-trained models to analyze images and extract insights and information from them with Azure AI Services. This free learning path will help prepare you for Exam AI-102: Designing and Implementing a Microsoft Azure AI Solution.

Applied Skills: Deploy containers by using Azure Kubernetes Service
Check out this learning path to learn about deploying containers and managing clusters on Azure Kubernetes Service.

AI learning journey for data professionals
Are you a data professional? This Microsoft Learn collection can help you prepare for AI solutions. Explore the basics of AI and gen AI, learn about working with Microsoft Copilot, and more.

Get started with Microsoft Copilot for Microsoft 365 extensibility
Do you know how to extend Microsoft Copilot for Microsoft 365? This new learning path delves into Copilot and explores options for extending it.

Azure Developers Hub
Check out the new Azure Developers Hub—the place to explore everything Azure for developers. Find the latest updates, tools, and code samples to jump-start your next project. Level up your skills with free tutorials and training modules.

 

Events & Challenges

Visual Studio Code Day 2024
This year’s VS Code Day explored GitHub Copilot, building generative AI apps, C# development with VS Code, and more. Watch the event on demand for loads of demos and tips that will help elevate your dev workflow.

Python Data Science Day
At the recent Python Data Science Day event, subject matter experts led sessions aimed at beginner to intermediate developers. Watch sessions on demand to learn about data science with VS Code and Python.

Cloud Skills Challenge: Azure Developer
Join the Azure 30 Days to Learn It Challenge! In less than 30 hours, you'll get hands-on experience with Azure services, learning how to store data and create serverless apps.

Cloud Skills Challenge: Java Apps on Azure
Take your Java skills to the cloud! Join the Java Apps on Azure Cloud Skills Challenge and learn how to build, migrate, and scale your Java-based apps for the cloud. Sign up today.

Build Intelligent Apps
Start building intelligent apps. Discover how to combine AI, cloud-scale data, and cloud-native app development to create powerful experiences that are adaptive, responsive, and personalized. Join this learning journey to get started.

Azure Developers: .NET Day 2024
Watch the recent event on demand. Build for the cloud, stay ahead of the curve, and maximize developer productivity.

Cloud Skills Challenge: .NET Azure Dev Challenge
Jump into the .NET Azure Dev Challenge for hands-on learning and some friendly competition. Benchmark your progress as you build new skills. This challenge is active from April 30 to May 17.

 

Videos

Intro to C# with GitHub Copilot in Visual Studio 2022
Learn how to use GitHub Copilot as a pair programming partner in your C# projects. Watch a demo that runs through a simple example using LINQ.

Making the right choice: Copilot Plugins vs. Graph Connectors
Bring business and app data into Copilot for Microsoft 365. Watch to learn how and find out whether plugins or Graph connectors are right for you.

Five-part LLMOps instructional video series
Build and monitor your own LLMOps. Watch this five-part video series for demos and a tour of the capabilities in Azure AI Studio. Learn about the model catalog, MaaS, prompt flow, AI Search, evaluation, and monitoring.

Supercharge your custom copilot in Microsoft Teams with Azure AI Vision and automation
Watch this short video to learn how to supercharge your custom copilot in Microsoft Teams with Azure AI Vision and automation, all in less than 5 minutes.

Where do I start with Microsoft 365? Teams apps in Teams, Outlook, and Copilot for Microsoft 365
Want to build apps for Microsoft 365 but don’t know where to start? This video will help. Take your first steps and learn how the Microsoft Teams app model is used. 

 

Blogs

Microsoft for Startups blog: Unlocking visual storytelling with AI
The Microsoft for Startups blog takes a look at Linum. Discover how this startup is helping storytellers create animations from descriptive text, without the need for expensive software or hardware.

Duolingo makes learning language fun with help from AI
Duolingo, the language-learning app, uses AI to enhance the learner experience. Now Duolingo is sharing the technology and engineering decisions that contributed to its success. Learn more on the Microsoft for Startups blog.

Harness any infrastructure-as-code framework in Azure Deployment Environments
Streamline app infrastructure provisioning. Use any Infrastructure-as-Code framework to deploy app infrastructure with the new extensibility model in Azure Deployment Environments. Read the blog for details. 

Expand your app’s capabilities and reach on Microsoft Teams using API-based message extensions
Now available, API-based message extensions offer the easiest way to integrate your app into Microsoft Teams. Learn about message extensions and find out how to expand the reach and capabilities of your app.

Dev Proxy v0.16 with simulated handling Teams Admin Center notifications for Microsoft Graph connectors
Focus on developing your app, not on things that won’t go into production. Dev Proxy v0.16 boasts new features. Simulate notifications for Microsoft Graph connectors, simulate webhooks, and more. Read the blog for details.


ASP.NET Core API Versioning


Introduction

When you have an existing deployed REST API that you want to change, you generally need to be very careful. Why would you want to change it? There can be a number of reasons:

  • Changes to the request contract (adding, removing or modifying request parameters)
  • Changes to the response contract (same as above)
  • Changes to the behaviour of the existing actions (they now do something slightly different)
  • Changes to the HTTP methods that the existing actions accept (not so common)
  • Changes to the HTTP return status codes

In a controlled environment, you can change the server and the client implementations at the same time, which means that you can update your clients accordingly, but, unfortunately, this is not always the case. Enter versioning!

With versioned APIs, you can have multiple versions of your API: one that legacy clients will still recognise and be able to talk to, and possibly one or more that are more suitable for the new requirements. Each client requests from the server the version it knows about, or, if one is not specified (legacy clients may not know about this), a default one is returned. Fortunately, ASP.NET Core MVC (and minimal APIs too, but that will be for another post) supports this quite nicely, meaning it can let you have multiple endpoints that map to the same route but each respond to a different version. Let’s see how!

What is a Version?

In this context, a version can be one of three things:

  • A floating-point number, which consists of a major and a minor version, such as 1.0, 2.5, 3.0, etc
  • A date, such as 2024-05-01
  • A combination of both, such as 2024-05-01.1.0

A version can also have a status, which can be a freeform text that adds context to the version, and is usually one of “alpha”, “beta”, “pre”, etc. Versions are implemented by the ApiVersion class.

Throughout this post, for the sake of simplicity, I will always be referring to versions as floating point numbers.
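For illustration, here is a minimal sketch of constructing versions with the ApiVersion class. The exact set of constructor overloads depends on the package version, so treat the status and floating-point overloads as assumptions based on the description above:

using Asp.Versioning;

// Major.minor form, e.g. 1.0 (equivalent to ApiVersion.Default)
var v1 = new ApiVersion(1, 0);

// Floating-point form, e.g. 2.5 (assumed overload, mirroring the attribute usage later in this post)
var v25 = new ApiVersion(2.5);

// A version with a status such as "beta", rendered as 1.0-beta
// (the status parameter is an assumption based on the description above)
var v1Beta = new ApiVersion(1, 0, "beta");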

Setting Up

We will need to add the NuGet packages Asp.Versioning.Mvc, Asp.Versioning.Mvc.ApiExplorer, and Swashbuckle.AspNetCore (for Swashbuckle, more on this later) to our ASP.NET Core project. Then, in the Program class, we need to register a few things:

builder.Services.AddApiVersioning(options =>
{
    options.AssumeDefaultVersionWhenUnspecified = true;
    options.DefaultApiVersion = new ApiVersion(1, 0); //same as ApiVersion.Default
    options.ReportApiVersions = true;
    options.ApiVersionReader = ApiVersionReader.Combine(
        new UrlSegmentApiVersionReader(),
        new QueryStringApiVersionReader("api-version"),
        new HeaderApiVersionReader("X-Version"),
        new MediaTypeApiVersionReader("X-Version"));
})
.AddMvc(options => {})
.AddApiExplorer(options =>
{
    options.GroupNameFormat = "'v'VVV";
    options.SubstituteApiVersionInUrl = true;
});

Let’s start with AddApiVersioning: we are telling it that the default API version (DefaultApiVersion) should be 1.0 and that it should assume it when a specific version is not specified (AssumeDefaultVersionWhenUnspecified). ASP.NET Core should also report all API versions that we have (ReportApiVersions), something that you may consider turning off in production. So how does ASP.NET Core infer the version that we want? That is the job of the ApiVersionReader. We have a number of possible strategies, and, here, we are combining them all. They are:

  • UrlSegmentApiVersionReader: extracts the API version from the URL path, for example, /api/v3/Product; this is one of the most common strategies
  • QueryStringApiVersionReader: in this case, the version comes from a query string parameter, for example, ?api-version=3; good mainly for testing purposes, IMO
  • HeaderApiVersionReader: headless clients can send the desired version as a request header, for example: X-Version: 2. Also a must-have
  • MediaTypeApiVersionReader: this one is not so commonly used, but it allows you to retrieve the version from the Accept or Content-Type request header fields, as in: Accept: application/json; X-Version=1.0; from my experience, it is rarely used, but, if you need more information, check out the guidelines

Of course, you can come up with your own clever way of extracting the version from the request; all you need is to roll your own implementation of IApiVersionReader. Here, the choice of parameters (api-version, X-Version) is obviously up to you, but I used the most common ones for each strategy. Usually, UrlSegmentApiVersionReader is what you want, together with the default version, for when you don’t specify a version on the URL.
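To make these strategies concrete, here is a hypothetical client-side fragment (the base address and the Product route are assumptions for illustration) showing the same request with the version supplied through the URL segment, the query string, and the custom header configured above:

using var client = new HttpClient { BaseAddress = new Uri("https://localhost:5001") };

// URL segment reader: version embedded in the path
var bySegment = await client.GetAsync("/api/v2.0/Product");

// Query string reader: ?api-version=2.0
var byQuery = await client.GetAsync("/api/Product?api-version=2.0");

// Header reader: X-Version: 2.0
var request = new HttpRequestMessage(HttpMethod.Get, "/api/Product");
request.Headers.Add("X-Version", "2.0");
var byHeader = await client.SendAsync(request);

// No version at all: falls back to DefaultApiVersion (1.0)
var byDefault = await client.GetAsync("/api/Product");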

Let’s have a look now at AddApiExplorer. Here, essentially, we are specifying how to name our versions (GroupNameFormat), in this case starting with a v and consisting of the major version, optional minor version, and status (VVV). This is essentially for OpenAPI clients such as Swashbuckle, which we’ll talk about soon (see here for the full reference of the format string). We’re also telling it to substitute any {version} parameter it sees in route templates with the actual version (SubstituteApiVersionInUrl), for example, api/v{version}/[controller].

Versioning Controller Actions

We have essentially three ways to version our actions:

  • Using attributes
  • Using code
  • Using conventions

Using Attributes

Let’s first have a look at how to use attributes to version some controller actions. One decision that needs to be made is: should we have our different version action methods on the same controller or separate them? If we are to have them on the same controller, we can come up with something like this:

[ApiController]
[Route("api/[controller]")]
[Route("api/v{version:apiVersion}/[controller]")]
[ApiVersion("1.0")]
public class ProductController : ControllerBase
{
    [HttpGet]
    public Data Get()
    {
        //not important here
    }

    [HttpGet]
    [ApiVersion(2.0, Deprecated = true)]
    public ExtendedData GetV2()
    {
        //not important here
    }
}

In this particular example, we are declaring two route templates for our controller:

  • One without a version (api/[controller])
  • Another one with a required version specified (api/v{version:apiVersion}/[controller])

The first one just says: hey, if no version is specified and the Product controller is requested, here it is. Because we are applying the [ApiVersion] attribute to the controller class, its Get method will default to the version specified in it, whereas GetV2 will honour the specific version attribute that is applied to it. It is worth noting that we can specify a version either as a string (“1.0”) or as a floating-point number (2.0). Just for completeness, we are also marking GetV2 with the Deprecated flag, which is something that we can leverage later on to extract information from the version, but it does not affect in any way how the version is called, or whether it can be called. Likewise, if we apply the [Obsolete] attribute, we get exactly the same result, from a metadata point of view. The usage of apiVersion to constrain the value of {version} is optional; it’s just there to prevent us from entering something on the URL that doesn’t make sense as a version.
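To make the routing concrete, here is how a few illustrative requests would resolve against this controller, given the configuration from the Setting Up section (the URL formats are assumptions for illustration):

// GET /api/Product                  -> Get()   (no version supplied, DefaultApiVersion 1.0 assumed)
// GET /api/v1.0/Product             -> Get()   (explicit 1.0 in the URL segment)
// GET /api/v2.0/Product             -> GetV2() (explicit 2.0, reported as deprecated)
// GET /api/Product?api-version=2.0  -> GetV2() (query string reader)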

Yet another option would be to declare new versions on a new controller class, probably the most typical case:

[ApiController]
[ControllerName("Product")]
[Route("api/v{version:apiVersion}/[controller]")]
[ApiVersion("3.0")]
[ApiVersion("4.0")]
public class ProductVXController : ControllerBase
{
    [HttpGet]
    [MapToApiVersion("3.0")]
    public ExtendedData GetV3()
    {
        //not important here
    }

    [HttpGet]
    [MapToApiVersion("4.0")]
    public ExtendedData GetV4()
    {
        //not important here
    }
}

Notice that here we are skipping the default route template and only adding the one with the version. We applied a [ControllerName] attribute to the class because we want it to also be called Product, and not ProductVX. Also, each action method takes its own [MapToApiVersion] attribute; this is how ASP.NET Core knows which one to call when a version is specified. Just out of curiosity, you can also have multiple versions pointing to the same method:

    [HttpGet]
    [MapToApiVersion("3.0")]
    [MapToApiVersion("4.0")]
    public ExtendedData GetV34(ApiVersion version)
    {
        //use version to detect the version that was requested
        if (version.MajorVersion == 3) { … }
        //not important here
    }

Here, we are injecting a parameter of type ApiVersion, which is automatically supplied by ASP.NET Core, and it allows us to retrieve the currently requested version (in this example, it could be either 3.0 or 4.0).

Finally, we can also declare a controller (or an action method) to be version-neutral; this is achieved by applying an [ApiVersionNeutral] attribute. Essentially, what it means is that the controller/action accepts any version, or no version at all. If you are using a version-neutral API, you can, however, force it to think that it is using the default version, by setting the AddApiVersionParametersWhenVersionNeutral option in the configuration:

.AddApiExplorer(options =>
{
    options.GroupNameFormat = "'v'VVV";
    options.SubstituteApiVersionInUrl = true;
    options.AddApiVersionParametersWhenVersionNeutral = true;
});

From my experience, I never had to use version-neutral APIs, so I think it’s safe to ignore them.
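Still, for completeness, a version-neutral controller might look like the following minimal sketch (the HealthController name and route are hypothetical, purely to illustrate the attribute):

using Asp.Versioning;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[ApiVersionNeutral]
[Route("api/[controller]")]
public class HealthController : ControllerBase
{
    // Responds regardless of the version supplied (or none at all)
    [HttpGet]
    public IActionResult Get() => Ok("healthy");
}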

Using Code

You can also use code to configure all of the aforementioned settings:

.AddMvc(options =>
{
    options.Conventions.Controller<ProductController>()
        .HasDeprecatedApiVersion(2.0)
        .HasApiVersion(1.0)
        .Action(c => c.GetV2()).MapToApiVersion(2.0)
        .Action(c => c.Get());

    options.Conventions.Controller<ProductVXController>()
        .HasApiVersion(3.0)
        .HasApiVersion(4.0)
        .Action(c => c.GetV3()).MapToApiVersion(3.0)
        .Action(c => c.GetV4()).MapToApiVersion(4.0);
})

As you can see, it’s very simple to understand, as the API is strongly typed.

Using Conventions

The last option we will explore for declaring API versioning is using conventions. Out of the box, ASP.NET Core versioning only offers one option: versioning by namespace name. This is implemented by the VersionByNamespaceConvention class, and what it does is infer the controller's version from the namespace it is declared in (nothing is known about individual action methods). So, for example (adapted from the documentation here):

Namespace                                     Version
Contoso.Api.Controllers                       1.0 (default)
Contoso.Api.v1_1.Controllers                  1.1
Contoso.Api.v0_9_Beta.Controllers             0.9-Beta
Contoso.Api.v20180401.Controllers             2018-04-01
Contoso.Api.v2018_04_01.Controllers           2018-04-01
Contoso.Api.v2018_04_01_Beta.Controllers      2018-04-01-Beta
Contoso.Api.v2018_04_01_1_0_Beta.Controllers  2018-04-01.1.0-Beta
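For illustration, a hypothetical controller placed in one of the versioned namespaces above would be inferred as version 1.1 without any [ApiVersion] attributes (the class body is just a sketch):

using Asp.Versioning;
using Microsoft.AspNetCore.Mvc;

namespace Contoso.Api.v1_1.Controllers
{
    [ApiController]
    [Route("api/v{version:apiVersion}/[controller]")]
    public class ProductController : ControllerBase
    {
        // Inferred as version 1.1 from the namespace by VersionByNamespaceConvention
        [HttpGet]
        public IActionResult Get() => Ok("Product v1.1");
    }
}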

The way to register this convention is like this:

    .AddMvc(options =>
    {
        options.Conventions.Add(new VersionByNamespaceConvention());
    })

If you want, you can implement your own custom convention by implementing IControllerConvention and registering it here. The way you'd do it is, you'd probably try to make assumptions from the namespace, the names of the action methods, and the like, and add version information to the controller model. That is, however, outside the scope of this post.

Using Versioning

So, after everything is set up, we can now call our versioned APIs and see that, depending on the version we call, different endpoints are reached. If we enabled the ReportApiVersions option, when we call an API we get the list of available versions, together with the deprecated ones, in the Api-Supported-Versions and Api-Deprecated-Versions headers:

[Image: response headers listing Api-Supported-Versions and Api-Deprecated-Versions]

Again, this may be considered a security issue so you may want to disable it in production!

Using Swagger

Swagger is a very popular spec and framework for generating API documentation compliant with the OpenAPI standard. Swashbuckle is a .NET package that generates Swagger/OpenAPI interfaces.

In the first section, we already added the required package Swashbuckle.AspNetCore, now we need to configure a few things; add this to the Program class:

builder.Services.AddSwaggerGen();
builder.Services.ConfigureOptions<ConfigureSwaggerOptions>();

app.UseHttpsRedirection(); //should already be there
app.UseSwagger();

//rest goes here

app.MapDefaultControllerRoute(); //should already be there, or a similar line

app.UseSwaggerUI(options =>
{
    var provider = app.Services.GetRequiredService<IApiVersionDescriptionProvider>();

    foreach (var description in provider.ApiVersionDescriptions)
    {
        options.SwaggerEndpoint($"/swagger/{description.GroupName}/swagger.json", description.GroupName.ToUpperInvariant());
    }
});

We register the required Swagger services and endpoints, and also an options configuration class, ConfigureSwaggerOptions. Here is the definition of the ConfigureSwaggerOptions class:

internal class ConfigureSwaggerOptions : IConfigureOptions<SwaggerGenOptions>
{
    private readonly IApiVersionDescriptionProvider _provider;
    private readonly IOptions<ApiVersioningOptions> _options;

    public ConfigureSwaggerOptions(IApiVersionDescriptionProvider provider, IOptions<ApiVersioningOptions> options)
    {
        _provider = provider;
        _options = options;
    }

    public void Configure(SwaggerGenOptions options)
    {
        foreach (var description in _provider.ApiVersionDescriptions)
        {
            var xmlFile = $"{Assembly.GetExecutingAssembly().GetName().Name}.xml";
            var xmlPath = Path.Combine(AppContext.BaseDirectory, xmlFile);

            if (File.Exists(xmlPath))
            {
                options.IncludeXmlComments(xmlPath);
            }

            options.SwaggerDoc(description.GroupName, CreateVersionInfo(description));
        }
    }

    private OpenApiInfo CreateVersionInfo(ApiVersionDescription description)
    {
        var info = new OpenApiInfo()
        {
            Title = $"API {description.GroupName}",
            Version = description.ApiVersion.ToString()
        };

        if (description.ApiVersion == _options.Value.DefaultApiVersion)
        {
            if (!string.IsNullOrWhiteSpace(info.Description))
            {
                info.Description += " ";
            }

            info.Description += "Default API version.";
        }

        if (description.IsDeprecated)
        {
            if (!string.IsNullOrWhiteSpace(info.Description))
            {
                info.Description += " ";
            }

            info.Description += "This API version is deprecated.";
        }

        return info;
    }
}

This IConfigureOptions<T> implementation is part of the Options Pattern; essentially, Swashbuckle uses the SwaggerGenOptions class to store configuration, and we use this class to modify it before it is actually used. What we do here is look at each API version and document it: we set the title of the API and the version, and we say whether it is deprecated or the default, plus we add the XML documentation to the API, if it exists as an XML file. This is a bit outside the scope of this article, but you can output the code comments on your API to an XML file, which can then be used to document the REST API through Swagger.
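If you want to try that, the XML documentation file is usually enabled in the project file; a minimal sketch (GenerateDocumentationFile is the standard MSBuild property; the NoWarn entry just silences warnings for members without comments):

<PropertyGroup>
  <GenerateDocumentationFile>true</GenerateDocumentationFile>
  <!-- Optional: don't warn about members that have no XML comments -->
  <NoWarn>$(NoWarn);1591</NoWarn>
</PropertyGroup>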

Once we do this, we can now access the /Swagger endpoint and see all the available API endpoints:

[Image: Swagger UI listing the available API endpoints]

Notice on the top right the list of API versions that we can choose from. For example, if we pick V2:

[Image: Swagger UI showing version V2, marked as deprecated]

Swashbuckle allows us to call an endpoint of a specific version without caring about the details of how we actually set the version, which comes in very handy.

Conclusion

API versioning is a powerful tool that you should leverage to avoid compatibility issues between your APIs and your clients. It allows your systems to evolve while maintaining compatibility with legacy clients. As always, I'm looking forward to your comments!

See https://github.com/dotnet/aspnet-api-versioning for examples and https://github.com/dotnet/aspnet-api-versioning/wiki for a discussion of these topics.


Two Track Transformation for Innovation in the Age of AI


Two Track Transformation for Innovation

“You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.” ― Buckminster Fuller

Leaders face a perpetual challenge when it comes to leading innovation.

You have to “run the business” while you “change the business.”

As head coach for Satya Nadella’s innovation team, I needed a way to explain how innovation fits in.

I call my mental model Two-Track Transformation because it is the best metaphor I found.

It helps leaders see how they can “run the business” while they “change the business.”

The Power of the Right Mental Model for Innovation

I’m on a mission to change how the world innovates.  I learned a lot at Microsoft and beyond and see a path to fruition.

To change how the world innovates, I need to empower innovation at 3 levels:

  1. Individual
  2. Team
  3. Organization

I found the biggest challenge is at the organization level because of culture and mindsets.

I learned from an anthropologist what separates the best leaders from the rest:

“They share their mental models.”

Wow.  That’s really it.

And I’ve seen leaders of orgs of tens of thousands of people (and more) fail to share their mental model around innovation.

When they don’t share their mental model of innovation, orgs spin up, get shut down, compete with each other, or fail to succeed (or succeed despite the org).

Two Track Transformation for Innovation (the New Mental Model for Modern Innovation)

Two-Track Transformation for Innovation empowers simultaneous growth through disruptive innovation and stability through sustaining innovation.

Here is how I whiteboard and walk through innovation when I explain it to leaders or CEOs:

[Image: Two Track Transformation for Innovation whiteboard]

The two tracks are in parallel:

  1. Track #1: Your Current Business:  This is your sustaining innovation track.  This is where you work on your current business model, including your current customers and current products and services.   This is where you “run the business.”
  2. Track #2: Your Future Business Today:  This is your disruptive innovation track.  This is where you work on your Future Business now.    You don’t work on your future business in the future.  You work on it today through small business experiments to validate value for the future.

What Two Track Transformation for Innovation Helps You See

I’ve found this model to be super simple and super helpful, and it has instant impact.

Suddenly, smart leaders see how their lack of a mental model, or the wrong metaphor, got in the way of effective innovation.

They see what sustaining vs. disruptive innovation looks like.

They see how their current organization created the Innovator’s Dilemma.

They see how their current KPIs for their current business destroy their innovation.

They instantly see how they can work on their future business now, while they work on their current business.

Sustaining + Disruptive Innovation

Two Track Transformation for Innovation combines Sustaining + Disruptive Innovation in parallel:

[Image: Sustaining + Disruptive Innovation]

Most leaders struggle with “innovation” because they lack clarity around the concepts of “sustaining” vs. “disruptive” innovation.

Sustaining Innovation:

  • Sustaining innovation is simply an improvement to an existing product or service.
  • This is where you make your good products better for your most sophisticated and demanding customers.
  • This is where big companies tend to focus because they find their best profits.
  • And this is what actually creates space for new entrants and disruptors.

Disruptive Innovation:

  1. Low-End Disruption
  2. New-Market Disruption

The big difference between low-end disruption and new-market disruption is that low-end disruption focuses on over-served customers, while new-market disruption focuses on underserved customers.

How Two Track Transformation for Innovation Solves the Innovator’s Dilemma

The Innovator’s Dilemma is that successful companies struggle to adopt disruptive innovations because they threaten their core business and existing customer base.

The two-track transformation approach directly addresses the Innovator’s Dilemma by allowing established companies to pursue disruptive innovation without cannibalizing their existing business.

Here’s how the two-track approach helps:

  1. Separate Focus: By having distinct tracks for sustaining and disruptive innovation, you can dedicate resources and strategies to each without hindering the other.
  2. Different Metrics: Different KPIs allow you to measure success for each track appropriately. Disruptive innovation can focus on user growth and market creation, while sustaining innovation maintains profitability and customer satisfaction for existing products.
  3. Organizational Agility: The two-track model encourages a more flexible and adaptable organizational structure. Disruptive innovation can be pursued through separate teams or even subsidiaries, allowing them to operate with a startup mentality without being constrained by the established company’s processes and priorities.

Benefits of Two Track Transformation for Innovation

The Two-Track Transformation model offers several incredible benefits for you as you figure out the ever-changing innovation landscape.

Here are three of the most significant:

  1. Simultaneous Growth and Stability: The two-track approach allows you to pursue growth through disruptive innovation while maintaining stability through sustaining innovation. This two-pronged attack helps you hedge your bets and ensures you’re not solely reliant on existing products and services. Disruptive innovation has the potential to unlock entirely new markets and revenue streams, while sustaining innovation keeps the core business healthy and profitable.
  2. Reduced Risk of Disruption: By actively participating in disruptive innovation, you become less susceptible to being disrupted by external forces. You gain a deeper understanding of emerging technologies and market trends, allowing you to adapt your existing offerings or create new ones that address future needs. Engaging in disruptive innovation is essentially a proactive strategy for staying ahead of the curve and maintaining a competitive edge.
  3. Enhanced Innovation Culture: The two-track model fosters a more dynamic and innovative company culture. It encourages employees to think outside the box, experiment with new ideas, and embrace a “fail fast, learn faster” mentality. Having a dedicated track for disruptive innovation creates a safe space for experimentation and allows employees to champion innovative ideas that might not necessarily be suitable for the existing business model. This cultivates a culture of continuous learning and adaptation, crucial for your long-term success in today’s rapidly evolving markets.

The Two-Track Transformation model provides a framework for leaders to not only survive but thrive in the face of disruption.

It allows you to balance the need for immediate success with the vision for long-term growth, ultimately leading to a more sustainable and future-ready business model.

Challenges with Existing Mental Models for Innovation

“All models are wrong, some are useful.” ― George Box

The problem with existing models and metaphors for innovation is they create the wrong mental model.

Existing models for innovation tend to put innovation either out there, in the future, or all at once.

All the models are helpful in some way, yet I find myself having to explain too much.

Here are 3 common models for handling the challenge of addressing Current Business vs. Future Business:

  1. McKinsey’s 3 Horizons of Growth
  2. The 3 Box Solution by Vijay Govindarajan
  3. Pivot to the Future by Omar Abbosh, Paul Nunes and Larry Downes

They are all great pieces of work.  They have all helped me and many other leaders in some way.

And yet, even with the models, I still found myself struggling to explain to leaders how to fit innovation in their organization.

Let’s take a quick look at each model so we understand the challenges…

McKinsey’s 3 Horizons of Growth

[Image: McKinsey 3 Horizon Model]

  • Early on it was a great model for addressing the Innovator’s Dilemma.
  • Orienting around horizons made a lot of sense because of the lag in the market.
  • Over time, this metaphor got in the way, as market change and disruption got faster.
  • People like Steve Blank have done really good constructive criticism of the model.
  • Many leaders use the model but get it wrong because they never read Alchemy of Growth.
  • I still use the model, but it’s not the right metaphor when I need to position how innovation fits in.

3 Box Solution by Vijay Govindarajan

[Image: The 3 Box Solution]

  • I really enjoyed this pragmatic approach by Vijay Govindrajan in his book, The 3 Box Solution.
  • Ironically, The 3 Box Solution helped me better understand McKinsey’s 3 Horizons of Growth.
  • Again, while it was helpful for “boxing up” the Past, the Present, and the Future, I found myself explaining too much.

Pivot to the Future by Omar Abbosh, Paul Nunes and Larry Downes

[Image: Pivot to the Future]

  • This felt to me like a fun spin on The 3 Box Solution.
  • I like how Pivot to the Future reinforces the idea of the Old, the Now, and the New.
  • I found it helpful for talking in some situations about the challenge of innovation.
  • But again I found myself explaining too much, and I was still missing a simple metaphor to explain how innovation fits in.

Don’t Lead Innovation with a One-Track Mind

Two-Track Transformation for Innovation elegantly balances your need for disruptive and sustaining innovations within your organization.

By taking a dual-path approach, you can explore groundbreaking ideas while enhancing your core offerings, ensuring both agility and continuity in your market presence.

This strategy not only secures your competitive edge but also stabilizes your business during volatile times, making it resilient against industry shifts.

Implementing this model ensures that innovation becomes a sustainable practice, not just sporadic bursts of creativity.

Ultimately, this Two Track Transformation for Innovation approach equips leaders to thrive in the Age of AI by being proactive and responsive, giving you the best of both worlds in innovation.

You Might Also Like

Innovation Hub
All the Big Ideas of Innovation on a Page
Amazon Leadership Principles for Innovation and Impact
Best Innovation Books
How To Become an Innovator
How To Use the Double Diamond Method to Innovate Better
How To Be Creative at Any Age
How To Use Imagine If to Innovate Better
What is Innovation?

 

 

The post Two Track Transformation for Innovation in the Age of AI appeared first on JD Meier.


Daily Reading List – May 1, 2024 (#309)


I had a very enlightening day of leadership meetings and am very bullish on the things Google Cloud is working on for customers. Energy level is high around here! I started my day with lots of reading, which you’ll find below.

[article] Retail banking turns to core modernization as cloud strategies mature. It’s time to finish some of those modernizations, people! These small, incremental approaches take too long, and they're holding up future progress.

[blog] GCP Data Engineering Project: Streaming Data Pipeline with Pub/Sub and Apache Beam/Dataflow. Big post showing off a complete solution for aggregating messages in a streaming architecture.

[article] MongoDB aims to jumpstart AI app development with MAAP. More tools and professional services on the way to help folks build generative AI apps.

[blog] How Konfig provides an enterprise platform with GitLab and Google Cloud. I’d suspect that if you’re reading my post, you probably have a source control system in place. But if not, or if you need an upgrade, I like what Real Kinetic is doing to make it easier to set up an enterprise-grade deployment.

[article] AI still has a ways to go in code refactoring. Readability and maintainability matter as much as (if not more than) coding speed, and Matt points out the need to supervise what AI is generating.

[blog] Supercharged Developer Portals. Good for Spotify for commercializing their open source tech and making developer portals easier to set up and use.

[article] AI, Your Task: Create Autonomous Agents. Vik (with help from AI) wrote this piece about “foundation agents” that learn and adapt to their environments.

[article] Who Takes a Risk on New Technology? That new technology won’t take off if there aren’t people willing to make personal bets on it. This article starts with a story about directors in Hollywood, and connects it to technology adoption.

[article] Java 17 is most-used LTS version of Java – report. It’s very good to see that Java 8 is finally falling out of favor with Java devs.

##

Want to get this update sent to you every day? Subscribe to my RSS feed or subscribe via email below:




How “AI” is helping my writing process


People often talk about how AI will kill us, and take our jobs. 

It may do all that, but for now, I think of it as technology that augments my capabilities, and I see this play out every day in my life as a writer. For instance, yesterday I interviewed Matthew Prince, co-founder of CloudFlare. The conversation ran for about an hour. I had an editable transcript in about three hours, ready for final finessing.

In the past, when I interviewed someone for about an hour, it would take a day for a transcription service to turn around the interview. It would cost me about $150. Or I would use an early AI-like app called Rev. Then, I would spend three days editing the interview and incorporating it into the story. When I was on tight deadlines, that would mean a sleepless night or two. I hoped that an editor would shape it into publishable copy.

Nowadays, the process is much faster because I use several AI tools. I can record my interview using a voice app on my iPhone, or use Zoom’s inbuilt feature. I import the audio file into MacWhisper, a great application – the best $10 I spent. (I can do this with Descript, but I much prefer not paying for yet another AI software subscription, considering I already pay for Poe, Midjourney, and OpenAI.)

It uses the open-source Whisper model to transcribe the entire voice file on my MacBook. I can save the transcript as a PDF and use Anthropic’s Claude Opus to clean it up. I have the right editing prompts to do the clean-up job fast.

With the right prompts, I can get the interview to a place where I can start editing it manually. The entire process, which would have taken a few days in the past, now takes between two and three hours. To me, that is what this new augmented intelligence is all about: helping me get the work done faster and more efficiently.

I’m not too fond of sleepless nights. I will let others lose sleep over AGI — while I can get to sleep longer, by finishing my writing sooner.

May 1, 2024. San Francisco

ALSO ABOUT AI & ME


Harnessing .NET Source Generators to Boost Performance

.NET 5 introduced Source Generators to automate C# source code generation during compilation. They enable automatic boilerplate code implementation, performance optimization, metaprogramming, and integration with external tools.
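As a flavour of what this looks like, here is a minimal hand-written sketch using the ISourceGenerator API (the generated namespace and class are made up for illustration; the generator project needs to reference the Roslyn Microsoft.CodeAnalysis packages, and newer code often prefers the incremental generator API for performance):

using Microsoft.CodeAnalysis;

[Generator]
public class HelloSourceGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // No-op in this sketch; this is where syntax receivers would be registered.
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // Adds a generated class to the consuming compilation at build time.
        const string source = @"
namespace Generated
{
    public static class Hello
    {
        public static string World() => ""Hello from a source generator!"";
    }
}";
        context.AddSource("Hello.g.cs", source);
    }
}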




