Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.
147939 stories · 32 followers

Becoming Frontier: How human ambition and AI-first differentiation are helping Microsoft customers go further with AI - The Official Microsoft Blog


Over the past few years, we have driven remarkable progress accelerating AI innovation together with our customers and partners. We are achieving efficiency and productivity at scale to shape industries and markets around the world. It is time to demand more of AI to solve humanity’s biggest challenges by democratizing intelligence, obsolescing the mundane and unlocking creativity. This is the notion of becoming Frontier: to empower human ambition and find AI-first differentiation in everything we do to maximize an organization’s potential and our impact on society.

Microsoft’s technology portfolio ensures our customers can go further with AI on their way to becoming Frontier firms, using our AI Transformation success framework as their guide. Our AI business solutions are dramatically changing how people gain actionable insights from data — fusing the capabilities of AI agents and Copilots while keeping humans at the center. We have the largest, most scalable, most capable cloud and AI platform in the industry for our customers to build upon their aspirations. We remain deeply focused on ensuring AI is used responsibly and securely, and embed security into everything we do to help our customers prioritize cybersecurity and guard against threats.

We are fortunate to work with thousands of customers and partners around the world — across every geography and industry. I am pleased to share some of the customer stories being showcased at our recently opened Experience Center One facility — each exemplifying the path to becoming Frontier.

Driven by a commitment to innovation, sustainability and operational excellence, ADNOC is helping meet the world’s growing energy demands safely and reliably, while accelerating decarbonization efforts. To empower its workforce, the company introduced OneTalent — a unified AI-powered platform consolidating over 16 legacy HR processes into a single, intelligent system that furthers its dedication to nurturing talent, aligning people with strategic goals and turning every member of its workforce into an AI collaborator. Partnering with Microsoft and AIQ, ADNOC applied AI across its operations to reimagine everything from seismic analysis to predictive maintenance. ENERGYai and Neuron 5 — AI-powered platforms built natively on Azure OpenAI — turn complexity into actionable insights. The platforms use predictive models to reduce downtime — by as much as 50% at one plant. They are also using autonomous agents to optimize energy use; unlocking data-driven insights that have accelerated energy workflows from months or years to just days or minutes.

Asset manager and technology provider BlackRock has been on a journey to infuse AI to level up how its organization operates across three key pillars: how they invest, how they operate and how they serve clients. To accelerate this mission, they partnered with Microsoft to transform processes across the investment management lifecycle by integrating cloud and AI technologies alongside its Aladdin platform. Embedded across 20 applications and accessed by tens of thousands of users, the Aladdin platform’s AI capabilities deliver functionally relevant tools to help redefine workflows for different types of financial service professionals. Client relationship managers are saving hours per client, reducing duplication and improving accuracy by evaluating CRM and market data to generate personalized client briefs and opportunity analyses using natural language processing — supported by verification and review methods that facilitate accuracy and compliance. Investment compliance officers are streamlining portfolio onboarding and compliance guideline coding, saving time on more straightforward tasks to focus on complex, investigative tasks. Portfolio managers can access data, analytics, research summaries, cash balances and more through AI-powered chat capabilities; enabling faster, more informed decision-making aligned with client mandates. With accelerated insights, improved data quality and enhanced risk management, BlackRock and its clients gain an advantage while enhancing client service, compliance and portfolio management.

To build on its culture of innovation and enable hyper-relevant messaging at scale, multinational advertising and media agency dentsu built a cutting-edge solution using Azure OpenAI: dentsu.Connect — a unified OS for its applications. By leveraging the power of AI across the entire campaign lifecycle, clients can build and execute campaigns while predicting marketers’ next best impact with confidence and precision. This end-to-end platform drives data connectivity and ensures seamless interoperability with clients’ technology and data stacks to maximize and drive brand relevance across content, production and media activation while aligning every action with business goals. dentsu.Connect helps minimize the gap between insights and action with speed and precision. Since launching, users have increased operational efficiency by 25%, improved business outcomes by 30% and quickened decision-making and data-driven AI insight generation by 125X.

Water management solutions and services partner Ecolab is harnessing the power of data-driven solutions to enable organizations to reduce water consumption, maximize system performance and optimize operating costs. Using Microsoft Azure and IoT services, the company built ECOLAB3D: an intelligent cloud platform that unifies diverse and dispersed IoT data to visualize and optimize water systems remotely. By providing actionable insights for real-time optimization across multiple assets and sites, Ecolab partners with global leaders such as Microsoft to collectively drive hundreds of millions in operational savings — while conserving more than 226 billion gallons of water annually; equivalent to the drinking water needs of nearly 800 million people. Delivering solutions across diverse industries, Ecolab is also a trusted partner for foodservice locations, helping balance labor costs with customer satisfaction. Its cloud-based platform Ecolab RushReady transforms data into an AI-enabled dashboard that improves daily operations by delivering actionable insights. In an Ecolab customer case study, this helped improve speed of service and sales labor per hour, resulting in increased profit of more than 10%. From data centers to dining rooms, Ecolab delivers intelligent, scalable solutions that transform operations for greater efficiency and measurable impact.

Leveraging Microsoft’s AI solutions across its portfolio, Epic built agentic “personas” to support care teams and patients, improve operations and financial performance and advance the practice of medicine. By summarizing patient records and automatically drafting clinical notes, one organization found that “Art” decreased after-hours documentation for clinicians by 60%, reduced burnout by 82% and helped them focus more on patient care. Care teams can also track long-term patient health and better plan treatment for chronic conditions, while nurses can perform wound image analysis automatically with 72% greater precision than manual methods. At one hospital, AI review of routine chest X-rays led to earlier discovery of over 100 cases of lung cancer, increasing the detection rate to 70% compared to the 27% national average. To support back-end operations, organizations are using “Penny” to improve the revenue cycle — resulting in $3.4 million in additional revenue at one regional network services provider. Epic also developed “Emmie” to have conversational interactions with patients and more easily help them schedule appointments and ask questions. Epic is leveraging Azure Fabric for the Cosmos platform to bring together anonymized data from more than 300 million patients, including 13 million with rare diseases, so physicians can connect with peers who have treated similar cases to improve rare disease diagnosis and select the most effective treatment.

To reduce professional burnout and accelerate scale across the industry, Harvey built an AI platform to automate legal research, contract reviews and document analysis. Harvey Assistant helps attorneys search across large document sets to identify specific clauses or provisions within seconds instead of hours. To support large-scale analysis, Harvey Vault manages and analyzes up to 100,000 files per project for complex tasks like litigation, while Harvey Workflows automates routine yet critical tasks into smaller AI-managed steps. With the integration of the newly expanded Microsoft Word add-in, AI capabilities provide legal teams with the ability to edit 100-plus page documents with a single query, enabling centrally controlled document compliance reviews that enhance efficiency while reducing risk. With more than 74,000 legal professionals using the platform, Harvey is helping them streamline workflows, reduce administrative burden and combat attorney fatigue — with the average user saving up to 25 hours of time per month.

To revolutionize drug discovery, biotech company Insilico Medicine is leveraging AI across its entire development pipeline — from target identification to molecule design and clinical trials. The company created Pharma.AI to accelerate research while reducing costs and improving success rates in emerging novel therapies — with developmental candidate timelines reduced from 2.5-4.5 years to 9-18 months for more than 20 therapeutic programs. The integrated AI platforms built with Azure AI Foundry manage complex biological data, identify disease-relevant targets and advance candidates to clinical trials — accelerating research in what is traditionally a slow, costly and complex pharmaceutical R&D process. They enable researchers to analyze genetic data and identify drug targets with AI-generated reports to facilitate business case development; use physics-based models to evaluate candidates for potency, safety and synthesizability; integrate with specialized large language models for drug discovery; and combine AI agents with structured workflows to reduce document drafting time by over 85% while improving first-pass quality of scientific documents by 60%.

To enhance manufacturing operations in a fast-paced and complex industry, global consumer foods producer Kraft Heinz partnered with Microsoft to embed AI and machine learning across its production facilities, resulting in smarter decision-making and operational improvements. The company built an AI-powered platform — Plant Chat — providing real-time insights on the factory floor and reducing downtime to enable faster, more confident decision-making with proactive guidance. The solution analyzes over 300 variables and allows operators to interact via natural language to improve consistency, reduce guesswork, decrease waste and maintain compliance — even for less experienced operators. Since implementation and collectively with other initiatives, these efforts have resulted in a 40% reduction in supply-chain waste, a 20% increase in sales forecast accuracy and a 6% product-yield improvement across all North American manufacturing sites through the third quarter of 2024. Combined with further operational improvements, this work has yielded more than $1.1 billion in gross efficiencies from 2023 through the third quarter of 2024.

To redefine work and scale intelligent automation globally, digital native Manus AI developed an advanced autonomous AI system designed to understand user intent and execute complex workflows independently across various domains. The solution leverages a multi-agent architecture through Microsoft Azure AI Foundry to deliver scalable, versatile task automation for millions of users worldwide. Its Wide Research capability deploys specialized sub-agents to rapidly perform large-scale, multi-dimensional research tasks; saving significant time and delivering actionable insights to make complex analysis accessible and efficient for strategic decision-making. Manus AI can also build dynamic dashboards so organizations can visualize trends, anomalies and market insights in real-time; driving strategic planning with reliable, up-to-date information. The multimodel image editing and creation capabilities also allow users to support brand consistency and enable marketers and product teams to iterate rapidly.

To advance automotive innovation, stabilize supply chain volatility, simplify production complexity and meet sustainability demands, Mercedes-Benz scaled AI innovation across its global production network. The MO360 data platform connects over 30 car plants worldwide to the Microsoft Cloud, enabling real-time data access, global optimization and analytics. The Digital Factory Chatbot Ecosystem uses a multi-agent system to empower employees with collaborative insights, and Paint Shop AI leverages machine learning simulations to diagnose efficiency declines and reduce energy consumption of the buildings and machines — including 20% energy savings in the Rastatt paint shop. Using NVIDIA Omniverse on Azure, Mercedes-Benz created large-scale factory digital twins for visualization, testing and optimization of production lines — enabling agile planning and continuous improvement. The MBUX Virtual Assistant embedded in over 3 million vehicles, powered by Microsoft’s ChatGPT and Bing Search, offers natural, conversational voice interactions and integrates Microsoft 365 Copilot with Teams directly into vehicles to enable mobile workspaces.

U.S. stock exchange and financial services technology company Nasdaq integrated AI capabilities into its Nasdaq Boardvantage platform to help corporate governance teams and board members save time, reduce information overload, improve decision-making and enhance board meeting preparation and governance workflows. The board management platform is used by leadership teams at over 4,000 organizations worldwide to centralize activities like meeting planning, agenda building, decision support, resolution approval, voting and signatures. Using Azure OpenAI GPT-4o mini, the AI Summarization feature helps board secretaries significantly reduce manual effort, saving hundreds of hours annually with accuracy between 91% and 97%. AI Meeting Minutes helps governance teams draft minutes by processing agendas, documents and notes while allowing for customization of length, tone and anonymization; accelerating post-meeting workflows and saving up to five hours per meeting.

As customers seek to use AI more to shop and search for products, luxury lifestyle company Ralph Lauren developed a personal, frictionless, inspirational and accessible solution to blend fashion with cutting-edge AI. Working with Microsoft, Ralph Lauren developed Ask Ralph: an AI-powered conversational tool providing styling tips and outfit recommendations from across the Polo Ralph Lauren brand. Powered by Azure OpenAI, the AI tool uses a natural language search engine to adapt dynamically to specific language inputs and interpret user intent to improve accuracy. It supports complex queries with exploratory or nuanced information needs with contextual understanding; and can discern tone, satisfaction and intent to refine recommendations. The tool also picks up on cues like location-based insights or event-driven needs. With Ask Ralph, customers can now reimagine how they shop online by putting the brand’s unique and iconic take on style right into their own hands.

Industrial automation and digital transformation expert Rockwell Automation is integrating AI and advanced analytics into its products to help manufacturers adapt seamlessly to market changes, reduce risk and develop agentic AI capabilities to support innovation and growth. FactoryTalk Design Studio™ Copilot, a cloud-based environment for programming, enables rapid updates to code for evolving production needs — reducing complex coding tasks from days to minutes. Rockwell’s digital twin software, Emulate3D®, creates physics-based models for virtual testing of automation code and AI, reducing costly real-world errors and production risks while cutting on-site commissioning times by 50%. With the integration of NVIDIA Omniverse — a collaborative, large-scale digital twin platform — users can perform multi-user factory design and testing to facilitate cross-disciplinary collaboration, address industry challenges and unlock opportunities through digital simulation before real-world deployment.

To enable a cleaner, more resilient energy future, Schneider Electric is powering AI-driven industry innovation by addressing grid stability and enterprise sustainability challenges. Built using Microsoft Azure, the company developed solutions for organizations to act faster and smarter while delivering measurable improvements in grid reliability and enterprise ESG management. Resource Advisor Copilot transforms raw ESG and energy data into actionable insights via natural language queries to support knowledge-based and system data questions; saving sustainability managers hundreds of hours annually in data analysis and reporting tasks in early testing. Grid AI Assistant allows operators to interact with complex grids using natural language to improve response times and accuracy during critical events; reducing outages by 40% and speeding up application deployment by 60%. Schneider Electric’s integration of AI tools reflects a strategic approach to digitally transforming energy management, addressing both operational resilience and sustainability imperatives.

To enhance personalized learning, streamline operations and support educators with innovative technology, the State of São Paulo’s Department of Education (SEDUC) partnered with Microsoft to equip schools with cloud and AI solutions — including Azure OpenAI, Microsoft 365, Azure and Dynamics 365. SEDUC is applying responsible AI solutions at scale to address sector priorities like delivering timely, high-quality formative feedback and reducing repetitive administrative work. With Essay Grader, teachers automate portions of grading and receive suggested feedback, freeing time for lesson design and individual support. With Question Grader, students can answer questions more openly with their own perspectives and reasoning while still receiving curated feedback typically reserved for extensive exams. By leveraging these AI-powered solutions, SEDUC is improving learning outcomes, boosting efficiency and strengthening teacher impact — anchored in equity, transparency and sound governance.

Australia’s leading telecommunications company, Telstra, is transforming its customer service operations to improve the experience for its customers and the people that serve them. One of the biggest pain points for teams is navigating multiple systems to identify and resolve a customer issue — leading to long handling times and reliance on how team members interpret various data sources. By leveraging AI solutions built on Azure OpenAI and Microsoft 365 Copilot, the company is enabling instant knowledge access and streamlined workflows. With One Sentence Summary, agents have a concise overview of customer interactions to improve efficiency and customer satisfaction — reducing call handling time by over one minute and repeat contacts by nearly 10%. Ask Telstra provides AI-generated responses from Telstra’s knowledge base in near real-time to assist agents with accurate product, plan and troubleshooting information across a wide variety of topics during calls; facilitating seamless agent-customer interactions with AI assistance.

As one of the largest global automakers, Toyota is pioneering AI intelligence in manufacturing with the O-beya System: a multi-agent AI system simulating expert discussions virtually. Based on decades of engineering knowledge, the solution fosters a collaborative project management approach to enhance problem-solving and innovation in vehicle development while identifying key challenges to help analyze and diagnose problems. O-beya can auto-select AI agents in fields like fuel efficiency, drivability, noise and vibration, energy management and power management to pinpoint causes and suggest solutions. The system also offers interactive features, including prompt history, term explanations and creative summaries, to further enable engineers to explore and validate mitigation strategies efficiently. The system leverages Microsoft Azure OpenAI, Azure AI Search and Azure Cosmos DB to analyze internal design data and help Toyota accelerate innovation, preserve institutional knowledge and resolve complex engineering issues faster. Since January 2024, over 800 powertrain engineers have accessed the system, utilizing it hundreds of times monthly across multiple business units.

As we seek to help our customers realize their AI ambitions, our mission remains unchanged: to empower every person and every organization on the planet to achieve more. We are at our best as a company when we put our technology to work for others. As you move forward on your AI journey, ask what AI can do for your organization and what it means to demand more from it. Leveraging the Microsoft portfolio, together we can do more to positively impact society; going beyond efficiency and productivity to solve for humanity’s biggest challenges. I look forward to partnering with you on your path to becoming Frontier.

Tags: Azure, Azure AI Foundry, Azure AI Search, Azure Cosmos DB, Azure OpenAI, Dynamics 365, Microsoft 365, Microsoft 365 Copilot, Microsoft Azure, Microsoft Cloud


Introducing Agent HQ: Any agent, any way you work


Evaluating Kotlin in Real Projects


Guest post by Urs Peter, Senior Software Engineer and JetBrains-certified Kotlin Trainer. For readers who’d like a more structured way to build Kotlin skills, Urs also leads the Kotlin Upskill Program at Xebia Academy.

This is the second post in The Ultimate Guide to Successfully Adopting Kotlin in a Java-Dominated Environment, a series that follows how Kotlin adoption grows among real teams, from a single developer’s curiosity to company-wide transformation.

Read the first part: Getting Started With Kotlin for Java Developers


The Evaluation Stage: Beyond Kotlin as a Playground

Once you’re comfortable with Kotlin in tests, it’s time for a more substantial evaluation. You have two main approaches:

  1. Build a new microservice / application in Kotlin
  2. Extend / convert an existing Java application

1. Build a new microservice/application in Kotlin

Starting fresh with a new application or microservice gives you the full Kotlin experience without the constraints of legacy code. This approach often provides the best learning experience and showcases Kotlin’s strengths most clearly.

Pro tip: Get expert help during this stage. While developers are naturally confident in their abilities, avoiding early mistakes, such as writing Java-ish Kotlin or overlooking Kotlin-powered libraries, can save months of technical debt.

This is how you can avoid common pitfalls when using Kotlin from a Java background:

Pitfall: Choosing a different framework from the one you use in Java.

Tip: Stick to your existing framework

Most likely, you were using Spring Boot with Java, so use it with Kotlin too. Spring Boot’s Kotlin support is first-class, so there is no additional benefit in using something else. Moreover, switching would force you to learn not only a new language but also a new framework, which only adds complexity without providing any advantage.

Important: Kotlin classes are closed by default and must be explicitly marked open before they can be extended, which clashes with Spring’s proxy-based approach of subclassing many of the classes it manages.

To avoid adding the open keyword to every Spring-related class (@Configuration, @Service, etc.), use the following build plugin: https://kotlinlang.org/docs/all-open-plugin.html#spring-support. If you create a Spring project with the well-known online Spring Initializr tool, this build plugin is already configured for you.
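
For reference, this is roughly what the relevant part of a Gradle Kotlin DSL build might look like; a minimal sketch in which the version numbers are illustrative placeholders, not recommendations:

Kotlin

//build.gradle.kts (sketch): the kotlin-spring plugin applies all-open to classes
//annotated with Spring annotations such as @Configuration, @Component and @Transactional
plugins {
    kotlin("jvm") version "2.0.0"                      //placeholder version
    kotlin("plugin.spring") version "2.0.0"            //placeholder version
    id("org.springframework.boot") version "3.3.0"     //placeholder version
}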

Pitfall: Writing Kotlin in a Java-ish way, relying on common Java APIs rather than Kotlin’s standard library: 

This list can be very long, so let’s focus on the most common pitfalls:

Pitfall 1: Using Java Stream rather than Kotlin Collections

Tip: Always use Kotlin Collections.

Kotlin Collections are fully interoperable with Java Collections, yet equipped with straightforward and feature-rich higher-order functions that make Java Stream obsolete. 

Here is an example that picks the top 3 products by revenue (price * sold) within each product category:

Java

record Product(String name, String category, double price, int sold){}

List<Product> products = List.of(
           new Product("Lollipop", "sweets", 1.2, 321),
           new Product("Broccoli", "vegetable", 1.8, 5));

Map<String, List<Product>> top3RevenueByCategory =
       products.stream()
          .collect(Collectors.groupingBy(
                Product::category,
                Collectors.collectingAndThen(
                    Collectors.toList(),
                    list -> list.stream()
                              .sorted(Comparator.comparingDouble(
                                          (Product p) -> p.price() * p.sold())
                                      .reversed())
                              .limit(3)
                              .toList())));

Kotlin

val top3RevenueByCategory: Map<String, List<Product>> =
   products.groupBy { it.category }
       .mapValues { (_, list) ->
           list.sortedByDescending { it.price * it.sold }.take(3)
       }

Kotlin’s Java interop lets you work with Java classes and records as if they were native Kotlin, though you could also use a Kotlin data class instead.
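
For completeness, the equivalent Kotlin data class would be a one-liner; a minimal sketch:

Kotlin

data class Product(val name: String, val category: String, val price: Double, val sold: Int)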

Pitfall 2: Keeping on using Java’s Optional.

Tip: Embrace Nullable types

One of the key reasons Java developers switch to Kotlin is its built-in nullability support, which waves NullPointerExceptions goodbye. Therefore, prefer nullable types and stop introducing new Optionals. Do you still have Optionals in your interfaces? This is how you easily get rid of them by converting them to nullable types:

Kotlin

//Let’s assume this repository is hard to change, because it’s a library you depend on
class OrderRepository {
      //it returns Optional, but we want nullable types
      fun getOrderBy(id: Long): Optional<Order> = …
}

//Simply add an extension method and apply the orElse(null) trick
fun OrderRepository.getOrderByOrNull(id: Long): Order? = 
                                    getOrderBy(id).orElse(null)

//Now enjoy the safety and ease of use of nullable types:

//Past:
 val g = repository.getOrderBy(12).flatMap { order ->
     order.goody.map { it.name }
}.orElse("No goody found")

//Future:
 val g = repository.getOrderByOrNull(12)?.goody?.name ?: "No goody found"

Pitfall 3: Continuing to use static wrappers.

Tip: Embrace Extension methods

Extension methods give you many benefits:

  • They make your code much more fluent and readable than wrappers.
  • They can be found with code completion, which is not the case for wrappers.
  • Because Extensions need to be imported, they allow you to selectively use extended functionality in a specific section of your application.

Java

//Very common approach in Java to add additional helper methods
public class DateUtils {
      public static final DateTimeFormatter DEFAULT_DATE_TIME_FORMATTER =
           DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

      public static String formatted(LocalDateTime dateTime,
                                     DateTimeFormatter formatter) {
         return dateTime.format(formatter);
      }

      public static String formatted(LocalDateTime dateTime) {
         return formatted(dateTime, DEFAULT_DATE_TIME_FORMATTER);
      }
}

//Usage
 DateUtils.formatted(LocalDateTime.now());

Kotlin

val DEFAULT_DATE_TIME_FORMATTER: DateTimeFormatter =
    DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")

//Use an extension method, including a default argument, which omits the need for an overloaded method. 
fun LocalDateTime.asString(
   formatter: DateTimeFormatter = DEFAULT_DATE_TIME_FORMATTER): String = 
      this.format(formatter)

//Usage
LocalDateTime.now().asString()

Be aware that Kotlin offers top-level functions and properties. This means we can simply declare DEFAULT_DATE_TIME_FORMATTER at the top level, without binding it to a class or object as is required in Java.

Pitfall 4: Relying on clumsy Java APIs

Tip: Use Kotlin’s slick counterpart. 

The Kotlin standard library uses extension methods to make Java libraries much more user-friendly, even though the underlying implementation is still Java. Almost all major third-party libraries and frameworks, like Spring, have done the same.

Example standard library:

Java

String text;
try (var reader = new BufferedReader(
           new InputStreamReader(new FileInputStream("out.txt"),
                                 StandardCharsets.UTF_8))) {
   text = reader
            .lines()
            .collect(Collectors.joining(System.lineSeparator()));
}
System.out.println("Downloaded text: " + text + "\n");

Kotlin

//Kotlin has enhanced the Java standard library with many powerful extension methods, like on java.io.*, which makes input stream processing a snap due to its fluent nature, fully supported by code completion

val text = FileInputStream("out.txt").use {
             it.bufferedReader().readText()
           }
println("Downloaded text: $text\n")

Example Spring:
Java

final var books =  RestClient.create()
       .get()
       .uri("http://.../api/books")
       .retrieve()
       .body( new ParameterizedTypeReference<List<Book>>(){}); // ⇦ inconvenient ParameterizedTypeReference

Kotlin

import org.springframework.web.client.body
val books = RestClient.create()
   .get()
   .uri("http://.../api/books")
   .retrieve()
   .body<List<Book>>() //⇦ Kotlin offers an extension that only requires the type without the need for a ParameterizedTypeReference

Pitfall 5: Using a separate file for each public class

Tip: Combine related public classes in a single file. 

This allows you to get a good understanding of how a (sub-)domain is structured without having to navigate dozens of files.

Java

//In Java, each public class must live in its own file:
//User.java, Account.java and Address.java - three files for one small domain.

Kotlin

//For domain classes consider data classes - see why below
data class User(val email: String,
            //Use nullable types for safety and expressiveness
           val avatarUrl: URL? = null, 
           var isEmailVerified: Boolean)

data class Account(val user:User,
              val address: Address,
              val mfaEnabled:Boolean,
              val createdAt: Instant)

data class Address(val street: String,
              val city: String,
              val postalCode: String)

Pitfall 6: Relying on the mutable programming paradigm

Tip: Embrace immutability – the default in Kotlin

The trend across many programming languages – including Java – is clear: immutability is winning over mutability. 

The reason is straightforward: immutability prevents unintended side effects, making code safer, more predictable, and easier to reason about. It also simplifies concurrency, since immutable data can be freely shared across threads without the risk of race conditions.

That’s why most modern languages – Kotlin among them – either emphasize immutability by default or strongly encourage it. In Kotlin, immutability is the default, though mutability remains an option when truly needed.

Here’s a quick guide to Kotlin’s immutability power pack:

1. Use val over var

Prefer val over var. IntelliJ IDEA will notify you whenever a var is never reassigned and could therefore be a val.
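
A minimal illustration:

Kotlin

val maxRetries = 3   //read-only reference: the default choice
var attempts = 0     //reassignable: use only when the value really has to change
attempts += 1        //this reassignment is why attempts must be a var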

2. Use (immutable) data classes with copy(...)

For domain-related classes, use data classes with val. Kotlin data classes are often compared with Java records. Though there is some overlap, data classes offer the killer feature copy(...), whose absence in Java makes transforming a record, something business logic often needs, rather tedious:

Java

//only immutable state
public record Person(String name, int age) {
   //Lack of default parameters requires overloaded constructor
   public Person(String name) { 
       this(name, 0);
   }
   //String concatenation with + due to the lack of string interpolation
   public String sayHi() {
       return "Hello, my name is " + name + " and I am " + age + " years old.";
   }
}

//Usage
final var jack = new Person("Jack", 42);
jack: Person[name=Jack, age=42]

//The issue is here: transforming a record requires manually copying the identical state to the new instance ☹️
final var fred = new Person("Fred", jack.age());

Kotlin

//also supports mutable state (var)
data class Person(val name: String,
                  val age: Int = 0) {
  //string interpolation
  fun sayHi() = "Hi, my name is $name and I am $age years old."
}
val jack = Person("Jack", 42)
jack: Person(name=Jack, age=42)

//Kotlin offers the copy method, which, due to the ‘named argument’ feature, allows you to only adjust the state you want to change 😃
val fred = jack.copy(name = "Fred")
fred: Person(name=Fred, age=42)

Moreover, use data classes for domain-related classes whenever possible. Their immutable nature ensures a safe, concise, and hassle-free experience when working with your application’s core.     

3. Prefer immutable over mutable collections

Immutable collections have clear benefits regarding thread-safety, can be safely passed around, and are easier to reason about. Although Java offers some immutability features for collections, their usage is dangerous because it easily causes exceptions at runtime:

Java

List.of(1,2,3).add(4); //❌unsafe 😬! .add(...) compiles, but throws UnsupportedOperationException

Kotlin

//The default collections in Kotlin are immutable (read-only)
listOf(1,2,3).add(4)  //✅safe: does not compile

val l0 = listOf(1,2,3) 
val l1 = l0 + 4 //✅safe: it will return a new list containing the added element
l1 shouldBe listOf(1,2,3,4) //✅

The same applies to Collections.unmodifiableList(...), which is not only unsafe but also requires an extra wrapper allocation on every call:

Java

class PersonRepo {
   private final List<Person> cache = new ArrayList<>();
   // Java – must clone or wrap every call
   public List<Person> getItems() {
       return Collections.unmodifiableList(cache);   //⚠️extra alloc
   }
}

//Usage
personRepo.getItems().add(joe); //❌unsafe 😬! .add(...) can be called but throws UnsupportedOperationException

Kotlin

class PersonRepo {

//The need to type ‘mutable’ for mutable collections is intentional: Kotlin wants you to use immutable ones by default. But sometimes you need them:

   private val cache: MutableList<Person> = mutableListOf<Person>()

   fun items(): List<Person> = cache //✅safe: though the underlying collection is mutable, returning it as its read-only supertype List<...> exposes only the read-only interface

}

//Usage
personRepo.items().add(joe) //✅safe: does not compile

When it comes to concurrency, immutable data structures, including collections, should be preferred. In Java, more effort is required with special collections that offer a different or limited API, like CopyOnWriteArrayList. In Kotlin, on the other hand, the read-only List<...> does the job for almost all use cases.

If you need to update shared collections safely across threads, Kotlin offers persistent collections (persistentListOf(...), persistentMapOf(...)) from the kotlinx.collections.immutable library: every modification returns a new collection, and they all share the same powerful interface as the standard collections.

Java

ConcurrentHashMap<String, Integer> persons = new ConcurrentHashMap<>();
persons.put("Alice", 23);
persons.put("Bob",   21);

//not fluent, and the data is copied into a second map
Map<String, Integer> incPersons = new HashMap<>(persons.size());
persons.forEach((k, v) -> incPersons.put(k, v + 1));

//wordy, and mutates the shared map in place
persons
   .entrySet()
   .stream()
   .forEach(entry ->
      entry.setValue(entry.getValue() + 1));

Kotlin

persistentMapOf("Alice" to 23, "Bob" to 21)
         .mapValues { (_, value) -> value + 1 } //✅the same rich API as any other Kotlin Map type, with no manual copying required

Pitfall 7: Continuing to use builders (or, even worse, trying to use Lombok)

Tip: Use named arguments.

Builders are very common in Java. Although they are convenient, they add extra code, are unsafe (required fields can be omitted without a compile error) and increase complexity. In Kotlin, they are of no use, as a simple language feature renders them obsolete: named arguments.

Java

public record Person(String name, int age) {

   // Builder for Person
   public static class Builder {
       private String name;
       private int age;

       public Builder() {}

       public Builder name(String name) {
           this.name = name;
           return this;
       }

       public Builder age(int age) {
           this.age = age;
           return this;
       }

       public Person build() {
           return new Person(name, age);
       }
   }
}

//Usage
new Person.Builder().name("Jack").age(36).build(); //compiles and succeeds at runtime

new Person.Builder().age(36).build(); //❌unsafe 😬: compiles but fails at runtime.

Kotlin

data class Person(val name: String, val age: Int = 0)

//Usage - no builder, only named arguments.
Person(name = "Jack") //✅safe: if it compiles, it always succeeds at runtime
Person(name = "Jack", age = 36) //✅

2. Extend/convert an existing Java application

If you have no greenfield option for trying out Kotlin, adding new Kotlin features or whole Kotlin modules to an existing Java codebase is the way to go. Thanks to Kotlin’s seamless Java interoperability, you can write Kotlin code that looks like Java to Java callers. This approach allows for:

  • Gradual migration without big-bang rewrites
  • Real-world testing of Kotlin in your specific context
  • Building team confidence with production Kotlin code

Rather than starting just anywhere, consider one of these approaches:

Outside-in:

Start with the “leaf” sections of your application, e.g. controllers, batch jobs, etc., and then work your way towards the core domain. This gives you the following advantages:

  • Compile-time isolation: Leaf classes rarely have anything depending on them, so you can flip them to Kotlin and still build the rest of the system unchanged.
  • Fewer ripple edits. A converted UI/controller can call existing Java domain code with almost no changes thanks to seamless interop (see the sketch after this list).
  • Smaller PRs, easier reviews. You can migrate file-by-file or feature-by-feature.
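
To illustrate the interop point, here is a minimal, hypothetical sketch of a freshly converted Kotlin controller calling an untouched Java service; OrderService and OrderDto are made-up names, and Spring Boot is assumed as the framework:

Kotlin

import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.PathVariable
import org.springframework.web.bind.annotation.RestController

//Hypothetical example: OrderService and OrderDto are existing, unchanged Java types
@RestController
class OrderController(private val orderService: OrderService) {

    //Kotlin calls the Java service exactly as Java would; only the syntax is new
    @GetMapping("/orders/{id}")
    fun order(@PathVariable id: Long): OrderDto = orderService.findById(id)
}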

Inside-out:

Starting at the core and then moving to the outer layers is often a riskier approach, as it compromises the advantages of the outside-in approach mentioned above. However, it is a viable option in the following cases:

  • Very small or self-contained core. If your domain layer is only a handful of POJOs and services, flipping it early may be cheap and immediately unlock idiomatic constructs (data class, value classes, sealed hierarchies).
  • Re-architecting anyway. If you plan to refactor invariants or introduce DDD patterns (value objects, aggregates) while you migrate, it’s sometimes cleaner to redesign the domain in Kotlin first.
  • Strict null-safety contracts. Putting Kotlin at the center turns the domain into a “null-safe fortress”; outer Java layers can still send null, but boundaries become explicit and easier to police.

Module by module

  • If your architecture is organized by functionality rather than layers, and the modules have a manageable size, converting them one by one is a good strategy.

Language features for converting Java to Kotlin

Kotlin offers a variety of features – primarily annotations – that allow your Kotlin code to behave like native Java. This is especially valuable in hybrid environments where Kotlin and Java coexist within the same codebase.
Kotlin

class Person @JvmOverloads constructor(val name: String,
                          var age: Int = 0) {
  companion object {

  @JvmStatic
  @Throws(InvalidNameException::class)
  fun newBorn(name: String): Person = if (name.isEmpty()) 
       throw InvalidNameException("name not set")
     else Person(name, 0)

   @JvmField
   val LOG = LoggerFactory.getLogger(Person::class.java)
  }
}

Java

//thanks to @JvmOverloads an additional constructor is created, propagating Kotlin’s default arguments to Java
var john =  new Person("John");

//Kotlin automatically generates getters (val) and setters (var) for Java
john.setAge(23);
var name = john.getName();

//@JvmStatic and @JvmField allow accessing (companion) object methods and fields as statics from Java

//Without @JvmStatic it would be: Person.Companion.newBorn(...)
var ken =  Person.newBorn("Ken"); 

//Without @JvmField it would be: Person.Companion.LOG
Person.LOG.info("Hello World, Ken ;-)");

//@Throws(...) will put the checked Exception in the method signature 
try {
  Person ken =  Person.newBorn("Ken");
} catch (InvalidNameException e) {
  //…
}

Kotlin

@file:JvmName("Persons")
package org.abc

@JvmName("prettyPrint")
fun Person.pretty() =
      Person.LOG.info("$name is $age years old")

Java

//@JvmName on files and methods makes calls from Java look natural: without it, the call would be PersonKt.pretty(ken)
Persons.prettyPrint(ken);

IntelliJ IDEA’s Java to Kotlin Converter

IntelliJ IDEA offers a Java to Kotlin converter, so in theory the tool can do the conversion for you. However, the resulting code is far from perfect, so use it only as a starting point and refine it into more idiomatic Kotlin from there. More on this topic will be discussed in the final part of this blog post series: Success Factors for Large-Scale Kotlin Adoption.
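
As a hypothetical illustration of the kind of refinement that is usually needed (the converter’s exact output varies by version), converter output tends to stay close to the Java original and can often be condensed considerably:

Kotlin

//Converter-style output (illustrative): correct, but still Java-ish
fun adultNames(people: List<Person>): List<String> {
    val result = ArrayList<String>()
    for (person in people) {
        if (person.age >= 18) {
            result.add(person.name)
        }
    }
    return result
}

//After refining it by hand, the same function becomes a one-liner
fun adultNames(people: List<Person>): List<String> =
    people.filter { it.age >= 18 }.map { it.name }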

Taking Java as a starting point will most likely lead you to write Java-ish Kotlin, which gives you some benefits but does not unleash Kotlin’s full potential. Therefore, writing a new application is the approach I prefer.

Next in the series

This installment in our Ultimate Guide to Successfully Adopting Kotlin in a Java-Dominated Environment series of blog posts demonstrated how Kotlin experiments can evolve into production code. Our next post focuses on the human side of adoption: convincing your peers. It explains how to present clear, code-driven arguments, guide new developers, and create a small but lasting Kotlin community within your team.

Urs Peter

Urs is a seasoned software engineer, solution architect, conference speaker, and trainer with over 20 years of experience in building resilient, scalable, and mission-critical systems, mostly involving Kotlin and Scala.

Besides his job as a consultant, he is also a passionate trainer and author of a great variety of courses ranging from language courses for Kotlin and Scala to architectural trainings such as Microservices and Event-Driven Architectures.

As a people person by nature, he loves to share knowledge and to inspire and get inspired by peers at meetups and conferences. Urs is a JetBrains-certified Kotlin trainer.


Work smarter with Copilot in the People, Files, and Calendar apps


Hey, Insiders! I’m Yash Kamalanath, a Principal Product Manager on the Microsoft 365 companion apps team. I’m excited to share that the companion apps – People, Files, and Calendar – are now smarter than ever: Microsoft 365 Copilot is built into People and Files, with Copilot in Calendar coming soon. These smart agents make finding what you need and getting things done in Windows lightning fast.

Work smarter with Copilot in the People, Files, and Calendar apps

Looking up a colleague, finding a file, or checking what’s next on your calendar should be effortless, but it’s easy to get distracted between tasks. With People, Files, and Calendar apps, everything you need is just a click away on the taskbar. These Copilot-integrated apps surface the right content in real time so you can stay focused and keep your work moving.

With Copilot, each companion app will be grounded in your work data – people, files, meetings – making them the fastest and easiest way to prompt for relevant questions. Companion apps offer instant suggestions for every item, plus a freeform box for your own prompts. For example, you can catch up on the latest from your top collaborators, flag comments that need your input, or recap meetings you missed. Start simple with a search in a companion app and seamlessly hand off to the Microsoft 365 Copilot app with full context for more complex inquiries – no extra steps needed.

Copilot in People: Pick up on updates from your collaborators

With Copilot, the People app goes beyond names and title searches – it surfaces recent communications, highlights key responsibilities, and suggests tailored prompts to help you connect and collaborate with teammates across your organization.

Gather more insights by asking Copilot:

  • “Tell me about John.”
  • “What’s the latest from John?”
  • “Show me follow up tasks with John.”

Copilot in Files: Learn more about your content

In the Files app, you can start a Copilot conversation directly from the content to summarize documents or presentations, review changes, analyze data, and create action items – without breaking your flow.

Gather more insights by asking Copilot:

  • “What’s the context for this?”
  • “Summarize this workbook.”
  • “Highlight key figures or trends.”

Copilot in Calendar: Keep your workday on track

Coming soon, Copilot will be integrated into the Calendar app, where you will be able to get meeting summaries and prep material to catch up and prepare for your day, manage your schedule in real time from your taskbar, and get up to speed in seconds on missed conversations. 

Gather more insights by asking Copilot:

  • “Suggest talking points for this meeting.”
  • “Check for action items.”
  • “Key takeaways from this meeting.”

Availability

Copilot in the companion apps is available for Windows 11 users who have Microsoft 365 companion apps installed, are on either Enterprise or Business SKUs, and have a Microsoft 365 Copilot license. Copilot in People and Files is available immediately, and Copilot in Calendar will become available soon. As an admin, you can pin the Microsoft 365 Copilot app together with selected companion apps to the Windows 11 taskbar on Intune-managed devices. This provides users with quick access to Copilot features such as Chat, Search, and Agents.

Learn more about setting up People, Files, and Calendar: Microsoft 365 companion apps overview - Microsoft 365 Apps

Learn more about pinning Microsoft 365 Copilot and its companion apps to the Windows taskbar: Pin Microsoft 365 Copilot and its companion apps to the Windows taskbar.

Feedback

We’re excited to bring you these new capabilities, and we’d love to hear your thoughts on how these companion apps are working for you! Share feedback or suggestions anytime using the Give Feedback button in the top-right corner of any of the companion apps.

 

Learn about the Microsoft 365 Insider program and sign up for the Microsoft 365 Insider newsletter to get the latest information about Insider features in your inbox once a month!


Evolving eCommerce Shipping Solutions


In this episode of Mailin’ It!, hosts Karla Kirby and Jeff Marino jump into the fast-paced world of e-commerce shipping with guest Heather Maday, Senior Director of Sales Enablement at the US Postal Service. Heather shares how USPS has transformed its operations to support today’s eCommerce economy, where shoppers expect speed, visibility, and value with every order. From data-driven logistics and new fulfillment technologies to transparent pricing, Heather explains how USPS helps businesses reach customers affordably and efficiently.


Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.





Download audio: https://afp-920619-injected.calisto.simplecastaudio.com/f32cca5f-79ec-4392-8613-6b30c923629b/episodes/efb1a195-2647-40cd-9c21-d318c8af9c65/audio/128/default.mp3?aid=rss_feed&awCollectionId=f32cca5f-79ec-4392-8613-6b30c923629b&awEpisodeId=efb1a195-2647-40cd-9c21-d318c8af9c65&feed=bArttHdR

Theme Parks, Bakeries, and G-Code: Samantha’s Unexpected Path Into Software


In this episode, I talk with Samantha Lopez, whose path into software development is anything but typical -- and that’s exactly what makes it so inspiring.


Whether you’re just starting out or pivoting mid-career, Samantha’s journey proves that your past experiences can be your biggest asset in tech -- you just have to connect the dots and keep going!


----

You can find Samantha at:

- LinkedIn: https://www.linkedin.com/in/samlopezdev/

- GitHub: https://github.com/samlopezdev

- Portfolio: https://samlopezdev.netlify.app/


----


Download audio: https://anchor.fm/s/f7b5ab38/podcast/play/110090479/https%3A%2F%2Fd3ctxlq1ktw2nl.cloudfront.net%2Fstaging%2F2025-9-23%2F409809056-44100-2-a1003fd8178b1.mp3