
Microsoft continues to evolve the AI capabilities in Teams, and its latest update brings a significant leap forward in meeting intelligence. Copilot in Teams can now analyze content shared on-screen during a meeting when recording is enabled, giving it a far richer understanding of the discussion context and visual information presented.
This enhancement means Copilot can now draw insights from three key data streams:
The meeting transcript – capturing what participants said.
The meeting chat – reflecting real-time feedback and reactions.
Screen-shared content – encompassing any visuals, documents, or data shown during screen sharing.
Together, these sources allow Copilot to deliver far more nuanced and context-aware responses.
What This Means in Practice
When a participant shares their screen, whether a PowerPoint presentation, Excel dashboard, document, or even a web app, Copilot can interpret that visual information to answer complex, context-specific questions.
For example:
“Which products had the highest sales?” – Copilot can reference the data displayed on the shared spreadsheet.
“What was the feedback per slide?” – Copilot can consolidate comments from both the transcript and chat to surface slide-by-slide insights.
“Rewrite the paragraph shared on the screen incorporating the feedback from the chat.” – Copilot can blend visual and conversational context to generate new content instantly.
This capability makes Copilot more than just a meeting summary tool. It becomes a cross-context reasoning system, connecting what was said, shown, and shared into actionable insights.
How It Works
The feature activates when meeting recording is enabled, allowing Copilot to access and process both the transcript and the visual data shared during the call. Importantly, this applies to desktop screen sharing and works across any platform or app, from internal systems and third-party websites to PDFs and design tools.
This open approach ensures that organizations don’t need to change how they present content; Copilot adapts to the meeting flow as it happens.
Support for PowerPoint Live and Whiteboard in Teams is on the roadmap, promising even deeper integration with Microsoft 365 collaboration experiences.
Why It Matters
From an IT and digital workplace perspective, this update represents a step toward context-aware AI collaboration, where assistants don’t just summarize but truly understand the full scope of a meeting.
For enterprises, this can mean:
Faster post-meeting documentation, since Copilot can generate summaries or follow-ups based on both conversation and visual materials.
More accurate insights, as Copilot can factor in the data shown on-screen rather than relying solely on transcripts.
Improved content creation workflows, with AI capable of rewriting or refining materials based on real-time feedback.
By bridging the gap between spoken conversation and shared visual content, Copilot is redefining what it means to capture organizational knowledge in hybrid meetings.
Summing Up
As Microsoft continues to expand Copilot’s scope, eventually adding support for PowerPoint Live, Whiteboard, and other collaborative surfaces, Teams is becoming a central hub for intelligent meeting analytics.
This update reinforces the direction of AI-augmented collaboration, where the goal isn’t just recording what happened in a meeting but turning every conversation, comment, and slide into insight-driven action.
