# Content

* [Introduction](introduction) contains information about the projects, such as expectations and grading.
* Background/Basics
  * [Server](background/server)
    * Our Servers
      * List of servers
      * Reserving GPUs
    * Connecting
      * OpenVPN
      * SSH tunneling
      * SSH config
    * Setting up your working environment on the server
      * pip
      * Conda
    * Getting data to the server
      * Git (favorite)
      * scp for larger files/directories
      * rsync (for example, with PyCharm)
    * tmux
      * Usage (detaching, etc.)
      * Example config
    * nvidia-smi and choosing the appropriate CUDA device (see the first sketch after this list)
    * htop
  * [Natural Language Processing](background/nlp)
    * Preprocessing
      * Tips
      * Tokenizer
        * spaCy / NLTK / ...
        * Byte-Pair Encoding (SentencePiece / Hugging Face tokenizers / ...)
      * Vocabulary
        * Frequencies/counts are helpful
    * Text representations
      * Word embeddings
      * Bag of words (see the second sketch after this list), e.g. `[1, 0, 0, 0, 1]`
    * Common NLP Architectures
      * Encoder-Decoder (Seq2Seq, Tree2Seq, ...)
      * Encoder
        * Siamese Network
      * Transformer
        * Attention
        * Architecture
  * [PyTorch](background/pytorch)
    * Basics
      * Tensor operations
      * Derive from nn.Module, custom modules, etc. (see the third sketch after this list)
      * Common loss functions
    * Use PyTorch Lightning
    * Training
* [LAVIS Experiment Tips](experiment_tipps)
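The "choosing the appropriate CUDA device" item above comes up constantly on shared servers. A minimal sketch of one way to do it, assuming PyTorch is installed; the GPU index `1` is a placeholder for whichever card `nvidia-smi` shows as free and you have reserved:

```python
import os

# Pin the process to a single GPU *before* importing torch, so PyTorch only
# sees the card you reserved. The "1" is a placeholder index taken from the
# output of `nvidia-smi`.
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "1")

import torch

# Inside this process, the single visible GPU is renumbered to cuda:0.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# Move tensors (and models) to the chosen device as usual.
x = torch.randn(4, 8).to(device)
```

Setting the environment variable before the first CUDA call also keeps other frameworks in the same process (e.g. TensorFlow) from grabbing every GPU on the machine.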
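For the vocabulary and bag-of-words items, a small pure-Python sketch; the whitespace "tokenizer" and the toy corpus are illustrative only, and real preprocessing would use spaCy, NLTK, or a BPE tokenizer as listed above:

```python
from collections import Counter

# Toy corpus, purely illustrative.
corpus = ["the cat sat", "the dog sat", "the cat ran"]

# Token frequencies over the corpus; counts help pick a vocabulary size/cutoff.
counts = Counter(tok for line in corpus for tok in line.split())

# Map the most frequent tokens to indices.
vocab = {tok: i for i, (tok, _) in enumerate(counts.most_common())}

def bag_of_words(text: str) -> list[int]:
    """Binary bag-of-words vector in the style of [1, 0, 0, 0, 1] above."""
    present = {vocab[tok] for tok in text.split() if tok in vocab}
    return [1 if i in present else 0 for i in range(len(vocab))]

print(vocab)                        # e.g. {'the': 0, 'cat': 1, 'sat': 2, ...}
print(bag_of_words("the dog ran"))  # 1s at the indices of 'the', 'dog', 'ran'
```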
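And for the PyTorch items, a minimal sketch of deriving from `nn.Module` together with a common loss function; all names and sizes here are made up for illustration:

```python
import torch
from torch import nn

class TinyClassifier(nn.Module):
    """Minimal nn.Module subclass: embed tokens, mean-pool, classify."""

    def __init__(self, vocab_size: int, embed_dim: int, num_classes: int):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # (batch, seq_len, embed_dim) -> mean-pool over the sequence dimension.
        pooled = self.embedding(token_ids).mean(dim=1)
        return self.classifier(pooled)  # class logits

model = TinyClassifier(vocab_size=1000, embed_dim=32, num_classes=2)
loss_fn = nn.CrossEntropyLoss()  # a common choice for classification

token_ids = torch.randint(0, 1000, (4, 10))  # fake batch: 4 sequences, 10 tokens
labels = torch.tensor([0, 1, 1, 0])
loss = loss_fn(model(token_ids), labels)
loss.backward()
print(loss.item())
```

PyTorch Lightning wraps exactly this kind of module in a `LightningModule` with the training loop factored out, which is presumably why the list above recommends it.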