
Oxford MSc AI for Business: Generative AI & LLMs – Complete 2026 Technical & Strategic Guide


Master the cutting edge of 2026 AI! This Oxford MSc AI for Business series covers Generative AI and Large Language Models (LLMs). These notes break down the Transformer Architecture, the science of Fine-Tuning (LoRA/PEFT), and the industry-standard RAG (Retrieval-Augmented Generation) framework. Learn how to move beyond basic prompting to build Agentic Systems that automate entire business workflows. Includes practical "Governance Checklists" for LLM deployment and "Oxford Exam Focus" points on AI alignment.



Document information

Uploaded on: January 23, 2026
Number of pages: 3
Written in: 2025/2026
Type: Class notes
Professor(s): Unknown
Contains: All classes

Content preview

UNIVERSITY OF OXFORD | MSC AI FOR BUSINESS
MODULE: GENERATIVE AI & LARGE LANGUAGE MODELS (LLMs)
TERM 1 LECTURE NOTES: TRANSFORMERS, RAG, AND AGENTIC SYSTEMS




1. THE GENERATIVE REVOLUTION
In 2026, Generative AI (GenAI) is defined as a class of AI that can create new content (text, images, code, video) by learning the underlying statistical patterns of human-generated data.
1.1 The Shift to "Foundation Models"
The Oxford curriculum focuses on Foundation Models: massive models trained on broad data that can be adapted to a wide range of downstream business tasks.
• Key Characteristic: Emergent properties, i.e. abilities such as reasoning or coding that were not explicitly programmed but "emerged" as the models were scaled up.




2. CORE ARCHITECTURE: THE TRANSFORMER
Everything in modern LLMs starts with the Transformer architecture, introduced by Google researchers in the 2017 paper "Attention Is All You Need".
• The Attention Mechanism: lets the model weigh the importance of every other token in the sequence when representing a given token, regardless of how far apart the tokens are.
• Tokenization: LLMs do not read "words"; they read tokens, numerical IDs assigned to word fragments by a subword tokenizer.
• Positional Encoding: since Transformers process all tokens simultaneously (parallelization), positional encodings are added to the token embeddings so the model can recover the order of the words.
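The two mechanisms above can be sketched in a few lines of NumPy. This is a minimal toy illustration, not a production implementation: the function names are my own, and the sinusoidal scheme with base 10000 follows the original Transformer paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each output row is a weighted mix of all value rows,
    so every token can attend to every other, regardless of distance."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax: rows sum to 1
    return weights @ V

def sinusoidal_positional_encoding(seq_len, d_model):
    """Inject token order; attention alone is permutation-invariant."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])               # even dims: sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])               # odd dims: cosine
    return pe

# Toy run: 4 tokens with 8-dimensional embeddings, self-attention (Q = K = V)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)) + sinusoidal_positional_encoding(4, 8)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

In a real Transformer, Q, K, and V come from three learned linear projections of the embeddings, and many such attention "heads" run in parallel; this sketch omits those projections to keep the core idea visible.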