How Microsoft 365 Copilot works


Get an inside look at how large language models (LLMs) work when you connect them to the data in your organization. See what makes this possible and how the process respects your privacy and keeps your data safe with Microsoft 365 Copilot. The LLM for Microsoft 365 Copilot is hosted in the Microsoft Cloud and is not trained on your organizational data. Copilot automatically inherits your organization’s security, compliance, and privacy policies for Microsoft 365.

Join Mary David Pasch to go inside the mechanics of AI-powered Copilot capabilities: what they do and how they work.

► QUICK LINKS:
00:00 - Introduction
01:11 - Large Language Models (LLMs)
02:19 - Write prompts to include additional info
04:04 - Core components of Copilot
06:31 - Generating content
07:48 - Wrap up

► Link References:
For more on how Microsoft operates its AI services, check out https://aka.ms/MicrosoftResponsibleAI

► Unfamiliar with Microsoft Mechanics?
Microsoft Mechanics is Microsoft’s official video series for IT. Watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.

• Subscribe to our YouTube: / microsoftmechanicsseries
• Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t
• Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com

► Keep getting this insider knowledge, join us on social:
• Follow us on Twitter: https://twitter.com/MSFTMechanics
• Share knowledge on LinkedIn: https://www.linkedin.com/company/micr
• Enjoy us on Instagram: https://www.instagram.com/msftmechanics/
• Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics


#Copilot #OpenAI #Microsoft365 #Microsoft365Copilot #gpt #chatgpt


Content

(music)

Have you ever wanted to know how large language models work when you connect them to the data in your organization? At Microsoft, we recently demonstrated Microsoft 365 Copilot, which transforms how we work by leveraging large language models that interact with your organizational data. Copilot works alongside you. For example, in Word, Copilot can easily write an entirely new document, like a business proposal, using content from your existing files. Or in Outlook, based on the content you select, Copilot can compose your email replies for you. In PowerPoint, you can transform your written content into a visually beautiful presentation with the click of a button. In Teams, Copilot can generate meeting summaries with discussed follow-up actions. Or while using Business Chat in Microsoft Teams, it can help you catch up on something you may have missed, bringing together information from multiple sources to bring you up to speed.

If you're wondering how large language models know what they know in these scenarios, let me break down the mechanics of what makes this possible, and how the process respects your privacy and keeps your data safe with Microsoft 365 Copilot.
First, let's look at where large language models, or LLMs, get their knowledge. LLMs are trained on massive amounts of public data, including books, articles, and websites, to learn language, context, and meaning. You can interact with large language models using natural language with what's called a prompt. A prompt is typically a statement or question. When you ask a question in the prompt, the LLM generates a response based on its public data training and understanding of context, which can come in part from how you phrase your prompt. For example, you might give it more details to generate a response.
As you continue to ask questions and get responses, the large language model is temporarily getting more context. Your full conversation gets sent with each subsequent prompt, so the LLM can generate relevant responses as you chat with it. It's processing natural language and referring to its knowledge like we would in conversation. A key difference is that it only remembers the conversation while it's in that conversation. The chat history is wiped clean with each new conversation, and it won't use the knowledge from your conversations and interactions to train the model.
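To picture that mechanic in plain code, here is a minimal sketch in Python. The call_llm function is a hypothetical stand-in for whatever chat-completion API a service like this calls; it is not a real Copilot or OpenAI API.

    # Minimal sketch: the model only "remembers" a conversation because the
    # full message history is resent with every prompt.
    def call_llm(messages):
        # Hypothetical stand-in for a chat-completion call; a real service would
        # generate a reply from the complete message list it receives.
        return f"(reply generated from {len(messages)} messages of context)"

    def chat_session(user_prompts):
        history = []  # a brand-new session starts with an empty history
        for prompt in user_prompts:
            history.append({"role": "user", "content": prompt})
            reply = call_llm(history)  # the entire history is sent each time
            history.append({"role": "assistant", "content": reply})
        return history  # discarded when the session ends; never used for training

    chat_session(["Who is Fabrikam?", "Summarize that in one sentence."])
    chat_session(["What did I just ask you?"])  # new session, empty history: it can't know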
That said, you can also write your prompt to include additional information, which the large language model will refer to as it generates its response. This is how you can give the LLM a little more knowledge it might need to answer your question.
I'll show you how this works using Microsoft Bing Chat's GPT-enabled public service, which has no affiliation with your organization's data. First, I'll ask it a completely random question that it can't answer: "What color shirt am I wearing today?" And it responds intelligently. It knows what a shirt is, but it can't see me to answer my question, so it responds accordingly, which is an accurate response.

Let me ask the question again, this time including some additional information in my prompt: I'll describe my outfit. Now you can see it responds using the information I gave it, which is more in line with what I was looking for. And now that it has the context, I can keep asking it related questions like, "What color shoes?" Again, that's because the prompt builds with each interaction. And to prove that the large language model doesn't retain the information, I'll start a new chat session and ask it again, "What color shirt am I wearing today?" And now it again says, "I can't see you, so I don't know." It knew what shirt I was wearing before only because I temporarily provided that additional limited information. In this new session, it no longer has access to what I said before, and I never told it my shirt color, so it doesn't know.
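In code terms, that demo boils down to placing the extra facts inside the prompt itself. Here is a rough Python sketch of the idea; the outfit details and wording are made up for illustration:

    # Illustrative only: information supplied in the prompt is the model's sole
    # source for answering a question it could not otherwise know.
    extra_context = "I am wearing a blue shirt and white shoes today."
    question = "What color shirt am I wearing today?"

    grounded_prompt = (
        "Use the following information to answer the question.\n"
        f"Information: {extra_context}\n"
        f"Question: {question}"
    )
    # Sent without extra_context, the model can only say it doesn't know. Sent
    # with it, the answer is grounded in the supplied facts. In a brand-new
    # session the extra_context is gone, so it is back to "I can't see you."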
So how does this work, then, in the context of Microsoft 365 Copilot? In my previous example using Bing Chat, I provided the prompt more information and context to give the LLM what it needed to generate the right response. This is what the Microsoft 365 Copilot system does automatically for you as you interact across different app experiences. To do this, Copilot has several core components.

First off are the large language models hosted in the Microsoft Cloud via the Azure OpenAI Service. To be clear, Copilot is not calling the public OpenAI service that powers ChatGPT; Microsoft 365 Copilot uses its own private instances of the large language models. Next, Microsoft 365 Copilot has a powerful orchestration engine that I'll explain in a moment. Copilot capabilities are surfaced in and work with Microsoft 365 apps. Microsoft Search is used for information retrieval to feed prompts, like I did in the example before, where information I provided in my prompt was used to help generate an answer. Then the Microsoft Graph, which has long been foundational to Microsoft 365, includes additional information about the relationships and activities across your organization's data. The Copilot system respects per-user access permissions to any content and Graph information it retrieves. This is important because Microsoft 365 Copilot will only generate responses based on information you have permission to access.
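As a rough mental model of how those components fit together, here is a hedged Python sketch. The helper functions are hypothetical stand-ins, not actual Copilot, Microsoft Search, or Microsoft Graph APIs:

    # Illustrative orchestration flow only; no real Copilot or Graph APIs here.
    def search_org_content(query, on_behalf_of):
        # A real implementation would query Microsoft Search and the Microsoft
        # Graph, returning only items this specific user is permitted to see.
        return [f"(snippet relevant to '{query}' that {on_behalf_of} can access)"]

    def call_private_llm(prompt):
        # Stand-in for a call to a private LLM instance hosted in the Microsoft
        # Cloud; the prompt is not retained or used to train the model.
        return f"(response generated from a {len(prompt)}-character grounded prompt)"

    def answer_with_copilot(user, user_prompt):
        # 1. Retrieval: gather relevant, permission-trimmed organizational content.
        snippets = search_org_content(query=user_prompt, on_behalf_of=user)
        # 2. Grounding: add that content to the prompt, just as extra details were
        #    typed by hand in the Bing Chat demo.
        grounded_prompt = user_prompt + "\n\nRelevant information:\n" + "\n".join(snippets)
        # 3. Generation: send the grounded prompt to the model and return the reply.
        return call_private_llm(grounded_prompt)

    print(answer_with_copilot("Kat", "Did anything happen yesterday with Fabrikam?"))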
Now let's go back to the example you saw earlier in Microsoft Teams, where a user asked, "Did anything happen yesterday with Fabrikam?" Copilot didn't just send that question, or prompt, directly to the large language model. Instead, Copilot knew that it needed more knowledge and context, so, using clues from the user's question, like Fabrikam, it inferred that it needed to search for content sources private to the organization. The Copilot orchestrator searched the Microsoft Graph for activities, ensuring it respected the user's permissions and access to information; in this case, the user Kat. It found the email thread from Mona that Kat received, activities in the Project Checklist and March planning presentation, which are files that Kat had access to, as well as the sharing action where the final contract was sent to Fabrikam for review, again, where Kat would have been part of the share activity. And Copilot cited each source of information so Kat could easily validate the response.

These are all individual steps that Kat could have done manually, like searching her inbox for emails from Mona, looking at recent project file activities in SharePoint, or reading the sharing notifications sent to Fabrikam for the contract. Copilot removed the tediousness of performing these steps manually and formulated a natural, easy-to-follow, and concise response in a single step. So that's how Business Chat with Copilot works.
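To make the citation step concrete, here is one hedged way the permission check and source list could be represented in code; the data shapes below are purely illustrative, not how Copilot actually stores them:

    # Illustrative only: each retrieved item carries its source, and only items
    # the requesting user (Kat) has access to ever reach the prompt.
    findings = [
        {"source": "Email thread from Mona", "kat_has_access": True},
        {"source": "Project Checklist activity", "kat_has_access": True},
        {"source": "March planning presentation", "kat_has_access": True},
        {"source": "Contract shared with Fabrikam for review", "kat_has_access": True},
    ]
    usable = [item for item in findings if item["kat_has_access"]]
    citations = [item["source"] for item in usable]
    # The generated summary is returned alongside these citations so Kat can
    # validate each part of the response herself.
    print("Sources:", "; ".join(citations))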
Now, in the examples I showed you earlier, you also saw how Microsoft 365 Copilot can help save you time in the apps you're working in by generating content. In fact, let's go back to the Copilot in Word example to explain how that worked.

Microsoft 365 Copilot can help generate a draft proposal by using content you've been working on, for example, in OneNote or other documents that you have access to, like Word or PowerPoint files. Here we combine the large language model's training on how a proposal document is structured and written with Microsoft 365 Copilot orchestration, which scans and takes relevant inputs from additional documents you've selected, adding the information to the prompt. The LLM is then able to generate an entirely new proposal document with the additional information from those files, providing a first draft that you can use to save time and quickly get started.
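Here is a hedged sketch of that drafting flow in Python, again with hypothetical helpers and file names rather than real Copilot or Office APIs:

    # Illustrative only: drafting a proposal from reference files the user
    # selected and already has access to.
    def extract_text(file_name):
        # Stand-in for pulling relevant content from a OneNote, Word, or
        # PowerPoint file.
        return f"(relevant content pulled from {file_name})"

    def call_private_llm(prompt):
        # Stand-in for the private LLM instance; prompts are not retained or
        # used for training.
        return "(first-draft proposal generated from the grounded prompt)"

    selected_files = ["meeting-notes.one", "product-overview.docx", "roadmap.pptx"]
    reference_material = "\n\n".join(extract_text(f) for f in selected_files)

    prompt = (
        "Draft a business proposal, structured the way proposals typically are, "
        "based on the reference material below.\n\n" + reference_material
    )
    print(call_private_llm(prompt))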
And just like with the Business Chat example, the important thing to remember here is that the enterprise data used to generate informed responses is only present as part of a prompt to the large language model. These prompts are not retained by the large language models nor used to train them, and all retrieved information is based on the individual data access and permissions you have while using Copilot.

So hopefully that explains how Copilot capabilities in Microsoft 365 work. For more information on how Microsoft operates its AI services, check out aka.ms/MicrosoftResponsibleAI. Please keep checking back to Microsoft Mechanics for the latest in tech updates, and thanks for watching.

(gentle music)

Source: https://www.youtube.com/watch?v=B2-8wrF9Okc