Lecture notes: Deep Learning part 2 (XM_0083), Master VU AI

Pages
68
Uploaded on
23-02-2022
Written in
2021/2022

These are the notes I made during the whole course; they are not a summary! As long as you understand this document you do not need to watch any lecture. With these notes I passed the course with a 7.5.


Professor(s)
Jakub Tomczak
Contains
Lectures 9-14

Content preview

Lecture 9
Learning with Graphs

introduction - graphs

for a given graph you should know whether it is/has:

undirected, simple graph

undirected multigraph: multiple edges allowed between two nodes

directed

self-loops possible

node/edge labels: are they unique?

properties on edges:

edge weight

RDF (unique id/link for each node)

any combination is possible
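The variants above can be sketched with plain Python data structures (a toy sketch; all structures and names are illustrative, not from the course):

```python
# Toy sketches of the graph variants above.

# undirected, simple graph: at most one edge per pair, stored symmetrically
simple = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}

# undirected multigraph: edge multiplicity counted per node pair
multi = {("A", "B"): 2, ("B", "C"): 1}          # two parallel edges A-B

# directed graph with edge weights and a self-loop
directed = {("A", "B"): 1.0, ("B", "B"): 0.5}   # (B, B) is a self-loop

# RDF-style graph: every node has a unique id/link (an IRI)
rdf_triples = [
    ("http://example.org/A", "http://example.org/knows", "http://example.org/B"),
]

assert "A" in simple["B"] and "B" in simple["A"]    # undirected: symmetric
assert multi[("A", "B")] == 2                       # parallel edges allowed
assert ("B", "B") in directed                       # self-loop present
```

Any combination of these properties (direction, multiplicity, labels, weights) can occur in one graph, which is why the representation must be pinned down before choosing a model.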

introduction - embeddings

embedding = low-dimensional representation of objects

low-dimensional = much lower than the original size

representation = there is some meaning to it; a representation corresponds to something (objects in the real world)

objects = words, sentences, images, audio, graphs, ...




this output is the representation of the input object, and thus its embedding



some embedding spaces have a useful structure in which you can navigate in certain directions, with features changing according to the direction (= navigable space); a direction then corresponds to certain semantic features of the object
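A toy illustration of such a navigable space, with made-up 2-d vectors where one axis stands for royalty and the other for gender (all vectors are assumptions for illustration only):

```python
import numpy as np

# Made-up 2-d "embedding space": axis 0 = royalty, axis 1 = gender.
emb = {
    "king":  np.array([1.0, 1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0, 1.0]),
    "woman": np.array([0.0, -1.0]),
}

# Moving along the "gender" direction: king - man + woman lands on queen.
result = emb["king"] - emb["man"] + emb["woman"]

closest = min(emb, key=lambda w: np.linalg.norm(emb[w] - result))
print(closest)  # queen
```

In a real learned space the directions are not axis-aligned or hand-chosen, but the same arithmetic applies.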




Deep Learning 98

embedding of images

embeddings of words

distributional hypothesis

"you shall know a word by the company it keeps"

"if units of text have similar vectors in a term-frequency matrix, then they tend to have similar meanings"

⇒ we can use context information to define embeddings for words,
one vector representation for each word

generally, we can train them (see lecture on word2vec)
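A minimal sketch of the distributional hypothesis: build a co-occurrence matrix from a tiny corpus (window of 1) and compare word vectors by cosine similarity. The corpus and window size are made up for illustration:

```python
import numpy as np

# Tiny made-up corpus; each word is represented by the counts of the
# words that appear directly next to it (context window of 1).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

counts = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            counts[idx[w], idx[corpus[j]]] += 1

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "cat" and "dog" occur in identical contexts here, so their
# co-occurrence vectors point in the same direction.
print(cosine(counts[idx["cat"]], counts[idx["dog"]]))  # close to 1.0
```

word2vec replaces these raw counts with trained dense vectors, but the underlying signal (shared context) is the same.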

illustration idea




why use graphs as input to ML?

classification, regression, clustering of nodes/edges/whole graphs

recommender systems

document modeling: entity and document similarity (use concepts from a graph)

alignment of graphs (which nodes are similar)

link prediction and error detection

linking text and semi-structured knowledge to graphs

graph embedding techniques

one big challenge with embedding graphs

we cannot straightforwardly feed a graph into a neural network because of its structure: it is not a linear structure

there are traditional ML methods on graphs

often having problems with scalability

often need manual feature engineering: task specific

embedding knowledge graphs in vector space

for each node in the graph → create a vector in vector space → easily fed into an ML algorithm

preserve info

unsupervised: task and dataset independent

efficient computation

low dimensional representation





2 major visions on how this should be done

Preserve topology

keeping neighbors close in embedding space: e.g. europe-germany

Preserve similarity

keeping similar nodes close together: e.g. europe-africa

2 major targets

improve original data

knowledge graph completion

link prediction

anomaly detection

downstream ML tasks

classification/regression/clustering/K-NN

then used as part of a larger process

QA/dialog systems

Translation

image segmentation

how to go from graph to embedding? 3 major approaches for propositionalization

translation: any relation in the knowledge graph is directly mapped to what should happen in the embedding space. So we want to see the same relations in the embedding space

has a lot of approaches, but they all come down to the same thing

TransE is the basic one, and all the others are more complex versions of it

= translating embeddings

take every edge of the graph → if this triple exists in the knowledge graph → we want it in our embedding space too, with the relation embedded as a vector that translates the head embedding to the tail embedding




so adding the relation embedding to the head has to give the tail




we want to minimize the distance between the sum h + l and the tail t (as close as possible) for every triple (h, l, t) in the knowledge graph S





this is not completely sufficient: it seems to work, but the problem is that we only get positive information ⇒ over-optimized

so not only minimizing, but also penalizing when wrong relations are given (maximize the distance of corrupted/bad triples)
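The scoring and margin idea can be sketched as follows (random toy embeddings; the entity/relation names and margin value are assumptions, and no actual training step is performed):

```python
import numpy as np

rng = np.random.default_rng(0)

# TransE sketch: entities and relations live in the same space,
# and a true triple (h, l, t) should satisfy h + l ≈ t.
dim = 8
entities = {e: rng.normal(size=dim)
            for e in ["paris", "france", "berlin", "germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def dist(h, l, t):
    """Distance d(h + l, t): small for triples we believe are true."""
    return np.linalg.norm(entities[h] + relations[l] - entities[t])

# Margin-based loss on one positive triple and one corrupted triple
# (here the tail is corrupted; gamma is the margin hyperparameter).
gamma = 1.0
pos = dist("paris", "capital_of", "france")
neg = dist("paris", "capital_of", "germany")   # corrupted tail
loss = max(0.0, gamma + pos - neg)
```

Training would minimize this loss over many (positive, corrupted) pairs, pulling true triples together while pushing corrupted ones at least a margin apart.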




the more complex variants become less scalable because they do more and more work
one hop away from each other = current entity → one hop → next entity

tensor/matrix factorization

make a 3D tensor of all relations in the graph and factorize it [factorizing = making a lower-dimensional representation of the 3D tensor, also with lower matrix dimensionality]
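A sketch of this tensor view, assuming a RESCAL-style factorization X[:, :, k] ≈ A R_k Aᵀ (the shapes and random factors are illustrative; real methods fit A and R_k to the observed triples):

```python
import numpy as np

# Tensor view of a knowledge graph:
# X[i, j, k] = 1 iff relation k links entity i to entity j.
n_entities, n_relations, rank = 4, 2, 2
X = np.zeros((n_entities, n_entities, n_relations))
X[0, 1, 0] = 1.0   # entity 0 -relation 0-> entity 1
X[2, 3, 1] = 1.0

rng = np.random.default_rng(0)
A = rng.normal(size=(n_entities, rank))        # one low-dim vector per entity
R = rng.normal(size=(n_relations, rank, rank)) # one small matrix per relation

def reconstruct(k):
    """Low-rank reconstruction of relation slice k: A @ R_k @ A.T."""
    return A @ R[k] @ A.T

print(reconstruct(0).shape)  # (4, 4)
```

The low-rank factors are the embeddings: each entity keeps only `rank` numbers instead of a full slice of the tensor.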



