Microsoft Build 2024: 6 takeaways for developers, data professionals


With AI and generative AI becoming the dominant theme for most enterprise software vendors at their annual conferences, Microsoft’s 2024 version of Build stuck to the norm.

Microsoft CEO Satya Nadella took the stage and summarized his keynote right at its start, saying the event was all about Copilots and the Copilot stack across most of the company’s offerings.

At the event, Nadella introduced a host of updates to Microsoft’s cloud platform to make working with LLMs easier and added generative AI-based assistants to many of its offerings. Here are some key takeaways from the conference that are relevant for developers and data professionals:

Microsoft’s updates to AI infrastructure include new Azure virtual machines (VM), a new provisioning service, and options for enabling access to Microsoft Copilot in Azure.

In the VM space, Microsoft shared plans to release Azure VMs running on its Cobalt 100 processors, currently in preview. The Azure Cobalt 100 CPU, built on Arm architecture, was launched by Microsoft in November last year in an attempt to make its data center infrastructure more energy efficient than commercial AMD and Intel CPUs.

Alongside the Azure VMs for Cobalt CPUs, Microsoft made its ND MI300X series of VMs, based on AMD accelerators, generally available on Azure. The ND MI300X VM, which combines eight AMD Instinct MI300X accelerators, will provide enterprises with better cost performance than rivals, especially for inferencing large language models such as GPT-4, according to the company.

Microsoft also released a new provisioning service, dubbed Azure Compute Fleet, which simplifies the provisioning of Azure compute capacity across different VM types, availability zones, and pricing models.

Other updates include opening up Copilot in Microsoft Azure to all enterprise customers over the next couple of weeks. Copilot in Azure was introduced to help enterprise teams manage cloud and edge operations in natural language.

The Build conference saw Microsoft adding new LLMs and governance features to Azure AI, the company’s cloud-based platform for building and running AI applications.

The new models added to the model catalog inside Azure AI Studio include OpenAI’s GPT-4o, showcased this week. Other models that have been added via Azure AI’s Models-as-a-Service (MaaS) offering include TimeGen-1 from Nixtla and Core42 JAIS, which are now available in preview. Models from AI21, Bria AI, Gretel Labs, NTT Data, Stability AI, and Cohere Rerank are expected to be added soon.

Microsoft has updated its Phi-3 family of small language models (SLMs) as well with the addition of Phi-3-vision, a new multimodal model that is expected to become available in preview.

The company also introduced new governance and safety features for Azure AI by updating Azure AI Content Safety, its system for monitoring model output.

The new feature, named Custom Categories, is currently in preview and will allow developers to create custom filters for specific content filtering needs. Other governance features added to Azure AI Studio and Azure OpenAI Service include Prompt Shields and Groundedness Detection, both of which are in preview.
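Conceptually, a custom category pairs a label with logic that decides whether text matches it. The toy Python sketch below illustrates that idea only; it is not the Azure AI Content Safety API, where custom categories are defined in the service from sample text rather than hand-written keyword lists.

```python
# Toy illustration of custom-category content filtering.
# The category names and trigger words here are hypothetical examples,
# not anything defined by Azure AI Content Safety.

def build_filter(categories: dict[str, set[str]]):
    """categories maps a category label to trigger words for that label."""
    def classify(text: str) -> list[str]:
        words = set(text.lower().split())
        # Return every category whose trigger words appear in the text.
        return sorted(label for label, triggers in categories.items()
                      if words & triggers)
    return classify

classify = build_filter({
    "self-promotion": {"subscribe", "follow"},
    "spoilers": {"ending", "finale"},
})
```

In the real service, a developer would instead upload example texts for each custom category and let the platform train the filter, then apply it alongside the built-in harm categories.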

Separately, Microsoft made Azure AI Studio generally available. The generative AI application development toolkit, which competes with the likes of Amazon Bedrock and Google Vertex AI Studio, was introduced in preview in November last year.

At the annual conference in Seattle, Microsoft also updated its Copilot Studio offering — a low-code tool for creating copilots — with agent-building capabilities.  

In addition, Microsoft said that it would be adding copilot connectors to Copilot Studio in order to simplify how developers connect their business and collaboration data to their copilots.

Copilot connectors include over 1,400 Microsoft Power Platform connectors, Microsoft Graph connectors, and Power Query connectors. Microsoft said that integration with Microsoft Fabric would be added soon.

The developer conference also saw Microsoft making several enhancements to Fabric, the company’s cloud-based suite of tools for data analytics. 

The additions and enhancements include a new Real-Time Intelligence module, a tool kit for customizing Fabric workflows, and the general availability of Copilot for Power BI.

The Real-Time Intelligence module combines the analytics and activator workloads and offers additional features, such as a low-code interface, to help enterprises generate insights from real-time data. The Fabric Workload Development Kit, meanwhile, is a tool kit designed to help developers build interoperable applications within Fabric.

Some of the other updates to Fabric include the addition of OneLake shortcuts to connect to data sources beyond just Azure Data Lake Storage Gen2, and a partnership with Snowflake to create full interoperability between Snowflake and OneLake.

Additionally, Microsoft said that enterprise users will be able to access Azure Databricks Unity Catalog tables directly in Microsoft Fabric in the coming months.

Microsoft released multiple updates to its database offerings at its developer conference. One of the major updates is the addition of vector search to Azure Cosmos DB for NoSQL.
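Vector search ranks stored items by how similar their embeddings are to a query embedding, which is what makes it useful for retrieval in generative AI applications. A minimal Python sketch of the underlying idea using cosine similarity; this illustrates the technique only, not the Cosmos DB API, where the equivalent is expressed through the service's SQL query language.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def vector_search(query, documents, top_k=2):
    """documents is a list of (id, embedding); return top_k ids by similarity."""
    ranked = sorted(documents,
                    key=lambda d: cosine_similarity(query, d[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

# Hypothetical two-dimensional embeddings; real embeddings have hundreds
# or thousands of dimensions produced by an embedding model.
docs = [("a", [1.0, 0.0]), ("b", [0.0, 1.0]), ("c", [0.9, 0.1])]
```

A database-native implementation avoids this brute-force scan by indexing the vectors, so queries stay fast as collections grow.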

Other database updates include the availability of an Azure Database for PostgreSQL extension for Azure AI and the addition of Copilot capabilities to Azure SQL DB.

Microsoft introduced Copilot Extensions, third-party add-ons to its GitHub Copilot AI-powered coding assistant, at its annual conference.

Copilot Extensions, as the name suggests, extend GitHub Copilot with capabilities tied to specific databases, SDKs, or APIs in the software development workflow. They are aimed at accelerating developer workflows without users having to leave the GitHub window or their IDEs.

Additionally, enterprises can create their own private Copilot Extensions while the GitHub Marketplace will offer extensions that are open to all.

Copyright © 2024 IDG Communications, Inc.


