# Content of this Wiki

* [Introduction](Project-Scope-and-Grading) contains information about the projects, such as expectations and grading.
* [Onboarding](Administrative Steps to On-board with your Project)
* Background/Basics
  * [Server](Background/Server)
    * Our Servers
      * List of servers
      * Reserving GPUs
    * Connecting
      * OpenVPN
      * SSH tunneling
      * SSH config
    * Setting up your working environment on the server
      * Pip
      * Conda
    * Getting data to the server
      * Git (favorite)
      * Scp for larger files/directories
      * Rsync (for example with PyCharm)
    * Tmux
      * Usage (detaching, etc.)
      * Example config
    * Nvidia-smi and choosing the appropriate CUDA device
    * Htop
  * [Natural Language Processing](Background/NLP)
    * Preprocessing
      * Tips
      * Tokenizer
        * SpaCy / nltk / ...
        * Byte-Pair Encoding (sentencepiece / HuggingFace ...)
      * Vocabulary
        * Frequencies/counts are helpful
    * Text representations
      * Word embeddings
      * Bag of words [1, 0, 0, 0, 1]
    * Common NLP Architectures
      * Encoder-Decoder (Seq2Seq, Tree2Seq, ...)
      * Encoder
        * Siamese Network
      * Transformer
        * Attention
        * Architecture
  * [PyTorch](Background/PyTorch)
    * Basics
      * Tensor Operations
      * Deriving from nn.Module, modules etc.
    * Common loss functions
    * Use PyTorch Lightning
    * Training
* [LAVIS Experiment Tips](Experiments)
* [Some very useful Bash Commands](Useful Bash Commands)
* [Conference Calendar](Conference Calendar)
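Since the outline mentions SSH tunneling and SSH config, here is a minimal illustrative `~/.ssh/config` entry. The host alias, hostname, username, and forwarded port are placeholders, not the actual server details (those are on the [Server](Background/Server) page):

```
# ~/.ssh/config — illustrative entry; hostname, user and ports are placeholders
Host gpu-server
    HostName gpu-server.example.org    # replace with the real server address
    User your-username
    Port 22
    # Forward local port 8888 to port 8888 on the server (e.g. for Jupyter)
    LocalForward 8888 localhost:8888
```

With such an entry in place, `ssh gpu-server` connects and sets up the tunnel in one step, and tools like `scp` and `rsync` can use the same alias.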
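The NLP section mentions building a vocabulary from token counts and bag-of-words vectors like `[1, 0, 0, 0, 1]`. A minimal sketch of both ideas, assuming whitespace tokenization and a hypothetical `min_count` cutoff (real preprocessing would use SpaCy, nltk, or a BPE tokenizer as listed above):

```python
from collections import Counter

def build_vocab(corpus, min_count=1):
    """Count tokens over the corpus and keep those seen at least min_count times,
    mapping each kept token to a stable index (alphabetical order)."""
    counts = Counter(tok for doc in corpus for tok in doc.lower().split())
    kept = sorted(tok for tok, n in counts.items() if n >= min_count)
    return {tok: idx for idx, tok in enumerate(kept)}

def bag_of_words(doc, vocab):
    """Binary bag-of-words vector over the vocabulary; unknown tokens are ignored."""
    vec = [0] * len(vocab)
    for tok in doc.lower().split():
        if tok in vocab:
            vec[vocab[tok]] = 1
    return vec

corpus = ["the cat sat", "the dog sat"]
vocab = build_vocab(corpus)          # {'cat': 0, 'dog': 1, 'sat': 2, 'the': 3}
print(bag_of_words("the cat", vocab))  # [1, 0, 0, 1]
```

Keeping the frequency counts around (not just the final mapping) is what makes cutoffs like `min_count` or frequency-sorted indices possible later.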