Table of Contents

Introduction
    Project Scope
    Expectations and Grading
Information / Fundamentals
    Server
        Our Servers
            List of servers
            Reserving GPUs
        Connecting
            OpenVPN
            SSH tunneling
            SSH config
        Setting up your environment on the server
            Pip
            Conda
        Getting data to the server
            Git (favorite)
            Scp for larger files/directories
            Rsync (for example with PyCharm)
        Tmux
            Usage (detaching, etc.)
            Example config
        Nvidia-smi and choosing the appropriate CUDA device
        Htop
    Deep Learning
        NLP
            Preprocessing
                Tips
                Tokenizer
                    SpaCy / nltk / ...
                    Byte-Pair Encoding (sentencepiece / HuggingFace ...)
                Vocabulary
                    Frequencies/counts are helpful
                Text representations
                    Word embeddings
                    Bag of words [1, 0, 0, 0, 1]
            Common NLP Architectures
                Encoder-Decoder (Seq2Seq, Tree2Seq, ...)
                    Encoder
                Siamese Network
                Transformer
                    Attention
                    Architecture
        PyTorch
            Basics
                Tensor Operations
                Deriving from nn.Module, modules, etc.
                Common loss functions
                Use PyTorch Lightning
            Training
LAVIS Experiments