
Microsoft Infuses AI into DevOps Workflows

· 4 min read
Wout Van Doorselaer
Sources

Source: article reproduced in full from devops.com
Original author: Mike Vizard


Microsoft this week added a bevy of tools to its portfolio that infuse generative artificial intelligence (AI) into DevOps workflows.

Unveiled at the Microsoft Build 2024 conference, those additions include GitHub Copilot for Azure, which enables DevOps teams to build, troubleshoot and deploy applications on the Microsoft Azure cloud using a natural language interface.

GitHub Copilot for Azure is one of multiple extensions that Microsoft and third-party partners are providing to, for example, customize tools from Docker, Inc. and Sentry. Microsoft is also embedding its Copilot into Microsoft Teams and Microsoft 365 development tools.

At the same time, Microsoft is previewing a Copilot for Azure that DevOps teams can use to orchestrate application deployments on Azure.

In addition, Microsoft Visual Studio 17.10 embeds GitHub Copilot directly into the integrated development environment (IDE) to provide diagnostic and code review capabilities. Microsoft also previewed updates to the Azure Developer Command Line Interface and the AI Toolkit for Microsoft Visual Studio Code that enable DevOps teams to integrate Copilot sample repositories into DevOps workflows, which can be extended to address large language model operations (LLMOps).

Microsoft CEO Satya Nadella told conference attendees that Microsoft is now applying the AI capabilities it provides to write code to manage IT infrastructure and IT operations. Microsoft is redefining software development as part of an effort to one day enable anyone to go from idea to code in an instant, he added.

Paul Nashawaty, practice lead for application development and modernization at the Futurum Group, noted that Microsoft is, in effect, streamlining development processes in a way that promises to make developers 50% more efficient as they build AI applications.

As part of that effort, Microsoft is also making available reference architectures and implementation guidance for building and deploying applications infused with AI models on Azure.

Microsoft is also previewing a Microsoft Azure Compute Fleet service that simplifies the provisioning of compute resources across different types of virtual machines, including spot instances, running in multiple availability zones, depending on cost, capacity and performance requirements. IT teams will be able to deploy and manage up to 10,000 virtual machines with a single call to an application programming interface (API), managing resources that can now dynamically scale as needed.

There is also now an additional Azure ND MI300X v5 VM instance based on graphics processing units (GPUs) from AMD, and Microsoft is previewing a Cobalt 100 Arm-based virtual machine built on its custom silicon. Microsoft has also made available a migration tool for shifting instances of Linux operating systems to Azure.

Microsoft is also adding support for a sidecar capability through which DevOps teams can embed logging, monitoring and caching capabilities without having to change application code. DevOps teams can now also use Git repositories to manage the development of applications created using the Microsoft Power Platform.

In addition, Microsoft is also adding support for Pulumi as another option for provisioning cloud infrastructure as code.
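To illustrate what provisioning cloud infrastructure as code with Pulumi looks like, here is a minimal sketch in Python using Pulumi's Azure Native provider. The resource names are hypothetical, and this is a generic example of the approach rather than anything specific Microsoft announced:

```python
import pulumi
from pulumi_azure_native import resources, storage

# Declare a resource group; Pulumi appends a random suffix to the physical name.
resource_group = resources.ResourceGroup("app-rg")

# Declare a storage account inside that resource group.
account = storage.StorageAccount(
    "appstorage",
    resource_group_name=resource_group.name,
    sku=storage.SkuArgs(name=storage.SkuName.STANDARD_LRS),
    kind=storage.Kind.STORAGE_V2,
)

# Expose the generated account name as a stack output.
pulumi.export("storage_account_name", account.name)
```

Running this requires the Pulumi CLI (`pulumi up`) and Azure credentials; on each run Pulumi compares the declared resources against the live infrastructure and applies only the difference.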

Finally, Microsoft also made Azure API Center generally available to discover, manage and govern APIs. It is now possible to import Azure OpenAI endpoints as APIs into the Azure API Management service, which now also supports OData and gRPC APIs.

Organizations Increasingly Operationalize AI

It’s not clear to what degree DevOps teams are infusing AI into workflows. However, as the amount of code being written using generative AI tools increases, it’s only a matter of time before those teams will need to revisit existing pipelines. Legacy continuous integration/continuous delivery (CI/CD) platforms were not designed for the AI era.

In addition, it’s only a matter of time before machine learning operations (MLOps) workflows are merged with DevOps workflows to accelerate the building and deployment of AI applications.

The only thing that remains to be seen is whether those changes will be made this year or next as organizations increasingly operationalize AI.