Lecture notes Deep Learning part 2 (XM_0083), Master VU AI


These are the notes I made during the whole course; this is not a summary! As long as you understand this document, you do not need to watch any lectures. With these notes I passed the course with a 7.5.



Document information

Uploaded on: 23 February 2022
Number of pages: 68
Written in: 2021/2022
Type: Lecture notes
Teacher(s): Jakub Tomczak
Contains: Lectures 9-14


Preview of the content

Lecture 9
Learning with Graphs

introduction - graphs

for a given graph you should know which of the following properties it has (see the sketch after this list):

undirected, simple graph

undirected multigraph: multiple edges between two nodes

directed

self-loops possible

node/edge labels: are they unique?

properties on edges:

edge weight

RDF (unique id/link for each node)

any combination of these is possible
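Below is a minimal sketch (assuming networkx is available; node names and weights are made up) of how the graph variants from this list can be represented in code:

```python
# Graph variants from the list above: simple, multi-, directed, self-loops, edge weights.
import networkx as nx

simple = nx.Graph()                                 # undirected, simple graph
simple.add_edge("Amsterdam", "Berlin", weight=650)  # edge weight as an edge property

multi = nx.MultiGraph()                             # undirected multigraph
multi.add_edge("A", "B", relation="train")
multi.add_edge("A", "B", relation="flight")         # two parallel edges between the same nodes

directed = nx.DiGraph()                             # directed graph
directed.add_edge("head", "tail")
directed.add_edge("loop", "loop")                   # self-loop

print(simple.number_of_edges(), multi.number_of_edges(), directed.number_of_edges())
```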

introduction - embeddings

embedding = low-dimensional representation of objects

low dimension = much lower than the original size

representation = there is some meaning to it; a representation corresponds to something (objects in the real world)

objects = words, sentences, images, audio, graphs,...




this output is the representation of the input object, and is thus its embedding



some embedding spaces have a useful structure in which you can navigate in certain directions, with features changing according to the direction = navigable space; a direction then corresponds to a certain semantic feature of the object (see the sketch below)
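As a toy sketch of such a navigable space (the vectors below are made up for illustration, not trained embeddings), the classic example is that moving along a "gender direction" turns king into something close to queen:

```python
# Toy, hand-made 3-d "embeddings"; only meant to illustrate direction arithmetic.
import numpy as np

emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# navigate the space: king - man + woman should land near queen
target = emb["king"] - emb["man"] + emb["woman"]
print(max(emb, key=lambda w: cosine(emb[w], target)))  # -> "queen" for these toy vectors
```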





embeddings of images

embeddings of words

distributional hypothesis

"you shall know a word by the company it keeps"

"if units of text have similar vectors in a text freq matrix, then they tend to have similar meaning"

⇒ we can use context information to define embeddings for words,
one vector representation for each word

generally, we can train them (see lecture on word2vec)
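A minimal sketch of this idea (toy corpus, raw co-occurrence counts only, no word2vec training): words that appear in similar contexts end up with similar count vectors.

```python
import numpy as np

corpus = ["the cat drinks milk", "the dog drinks water", "the cat chases the dog"]
window = 1                                         # context = one word left/right
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

cooc = np.zeros((len(vocab), len(vocab)))          # word-by-context count matrix
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                cooc[idx[w], idx[sent[j]]] += 1

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

# "cat" and "dog" share contexts ("the ... drinks"), so their rows are similar
print(cosine(cooc[idx["cat"]], cooc[idx["dog"]]))
```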

illustration of the idea (figure omitted)

why use graphs as input to ML?

classification, regression, clustering of nodes/edges/whole graphs

recommender systems

document modeling: entity and document similarity (use concepts from a graph)

alignment of graphs (which nodes are similar)

link prediction and error detection

linking text and semi-structured knowledge to graphs

graph embedding techniques

one big challenge with embedding graphs

we cannot straightforwardly feed a graph into a neural network because of its structure: it is not a linear structure

there are traditional ML methods on graphs

they often have problems with scalability

they often need manual feature engineering: task-specific

embedding knowledge graphs in vector space

for each node in the graph → create a vector in vector space → easily feed it into an ML algorithm (see the sketch after this list)

preserve info

unsupervised: task and dataset independent

efficient computation

low dimensional representation
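A hypothetical sketch of the "easily feed into an ML algorithm" step (assuming scikit-learn; the embeddings and labels below are random stand-ins for vectors produced by a real graph-embedding method):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
num_nodes, dim = 100, 16

node_emb = rng.normal(size=(num_nodes, dim))  # stand-in for learned node embeddings
labels = (node_emb[:, 0] > 0).astype(int)     # toy node labels

# any off-the-shelf classifier works once nodes are plain vectors
clf = LogisticRegression().fit(node_emb[:80], labels[:80])
print("held-out accuracy:", clf.score(node_emb[80:], labels[80:]))
```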





2 major visions on how this should be done

Preserve topology

keeping neighbors close in embedding space: e.g. Europe-Germany

Preserve similarity

keeping similar nodes close together: e.g. Europe-Africa

2 major targets

improve original data

knowledge graph completion

link prediction

anomaly detection

downstream ML tasks

classification/regression/clustering/K-NN

then used as part of a larger process

QA/dialog systems

Translation

image segmentation

how to go from graph to embedding? 3 major approaches for propositionalization

translation: any relation in the knowledge graph is directly mapped to what should happen in the embedding space, so we want to see the same relations in the embedding space

there are a lot of approaches, but they all come down to the same thing

TransE is the basic one, and all the other ones are more complex versions of it

= translation embedding

take every edge of the graph → if it exists in the knowledge graph → we want to have it in our embedding space as an edge embedding: a vector that is the translation from the one node embedding to the other




so adding the relation embedding l to the head embedding h has to give the tail embedding t (h + l ≈ t)




we want to minimize the distance between the sum of h and l (as close as possible) and t, for triples (h, l, t) in the knowledge graph S




this is not completely sufficient; it seems to work, but the problem is that we only get positive information ⇒ over-optimized

so not only minimizing, but also penalizing when wrong relations are given (maximize the distance for bad/corrupted triples); see the loss sketched below
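The formula from the slides is not included in this preview; as a reference, the standard TransE margin-based ranking loss (my notation, not copied from the missing figure) is:

```latex
\mathcal{L} \;=\; \sum_{(h,\ell,t)\in S}\;\sum_{(h',\ell,t')\in S'_{(h,\ell,t)}}
\bigl[\,\gamma + d(h+\ell,\,t) - d(h'+\ell,\,t')\,\bigr]_{+}
```

where d is the L1 or L2 distance, γ is a margin, S is the set of true triples, S' contains corrupted triples (head or tail replaced), and [x]_+ = max(0, x).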




the more complex variants become less scalable because they do more and more extra work
one hop away from each other = current entity → one hop → next entity
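A minimal TransE sketch in PyTorch (an assumed framework; the embedding dimension, margin and toy triples are invented, and the usual per-step normalization of entity embeddings is omitted):

```python
import torch
import torch.nn as nn

class TransE(nn.Module):
    def __init__(self, num_entities, num_relations, dim=50, margin=1.0):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        nn.init.uniform_(self.ent.weight, -0.1, 0.1)
        nn.init.uniform_(self.rel.weight, -0.1, 0.1)
        self.margin = margin

    def distance(self, h, l, t):
        # d(h + l, t) with the L2 norm
        return torch.norm(self.ent(h) + self.rel(l) - self.ent(t), p=2, dim=-1)

    def loss(self, pos, neg):
        # pos, neg: (batch, 3) index tensors of (head, relation, tail)
        d_pos = self.distance(pos[:, 0], pos[:, 1], pos[:, 2])
        d_neg = self.distance(neg[:, 0], neg[:, 1], neg[:, 2])
        return torch.clamp(self.margin + d_pos - d_neg, min=0).mean()

# toy usage: 4 entities, 2 relations, one true triple and one corrupted tail
model = TransE(num_entities=4, num_relations=2)
pos = torch.tensor([[0, 1, 2]])
neg = torch.tensor([[0, 1, 3]])
opt = torch.optim.SGD(model.parameters(), lr=0.01)
for _ in range(100):
    opt.zero_grad()
    model.loss(pos, neg).backward()
    opt.step()
print(model.distance(torch.tensor([0]), torch.tensor([1]), torch.tensor([2])))
```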

tensor/matrix factorization

make a 3D tensor of all relations in the graph and factorize it [factorizing = make a lower-dimensional representation of the 3D tensor, also with lower matrix dimensionality] (see the sketch below)
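A rough sketch of this idea (a RESCAL-style factorization; all names and sizes are made up): the relation tensor X of shape (relations × entities × entities) is approximated from a small entity matrix A and one small mixing matrix R_r per relation, which for large graphs has far fewer parameters than the full tensor. In practice A and R are learned by minimizing the reconstruction error.

```python
import numpy as np

num_entities, num_relations, dim = 5, 2, 3
rng = np.random.default_rng(0)

A = rng.normal(size=(num_entities, dim))        # one low-dimensional embedding per entity
R = rng.normal(size=(num_relations, dim, dim))  # one small mixing matrix per relation

# reconstructed tensor: X_hat[r, i, j] = a_i^T R_r a_j
X_hat = np.einsum("id,rde,je->rij", A, R, A)
print(X_hat.shape)  # (2, 5, 5)
```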



