Contains high-quality implementations of deep reinforcement learning algorithms written in PyTorch.
Updated May 19, 2021 - Jupyter Notebook
This repository contains a series of Google Colab notebooks I created to help people dive into deep reinforcement learning. The notebooks contain both the theory and implementations of the different algorithms.
Official implementation of "Adaptive Smoothing Gradient Learning for Spiking Neural Networks", ICML 2023
Minimal implementation of the network layers from the paper "Noisy Networks for Exploration" using PyTorch.
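The core idea in "Noisy Networks for Exploration" (Fortunato et al., 2017) is to replace a linear layer's fixed weights with learnable mean and noise-scale parameters, so exploration comes from parameter noise rather than ε-greedy. The following is a minimal NumPy sketch of such a layer with factorized Gaussian noise — not the code from any repo listed here, and forward-pass only (in practice μ and σ are trained by backprop):

```python
import numpy as np

def f(x):
    # Factorized-noise scaling from the paper: f(x) = sign(x) * sqrt(|x|)
    return np.sign(x) * np.sqrt(np.abs(x))

class NoisyLinear:
    """Sketch of a noisy linear layer y = (w_mu + w_sigma * eps_w) x + (b_mu + b_sigma * eps_b),
    with factorized Gaussian noise eps_w = f(eps_out) f(eps_in)^T."""

    def __init__(self, in_features, out_features, sigma0=0.5, seed=0):
        self.rng = np.random.default_rng(seed)
        bound = 1.0 / np.sqrt(in_features)
        # Learnable means, initialized uniformly as in the paper
        self.w_mu = self.rng.uniform(-bound, bound, (out_features, in_features))
        self.b_mu = self.rng.uniform(-bound, bound, out_features)
        # Learnable noise scales, initialized to sigma0 / sqrt(in_features)
        self.w_sigma = np.full((out_features, in_features), sigma0 * bound)
        self.b_sigma = np.full(out_features, sigma0 * bound)
        self.in_features, self.out_features = in_features, out_features

    def forward(self, x, noisy=True):
        if not noisy:
            # Evaluation mode: use the mean weights only
            return x @ self.w_mu.T + self.b_mu
        # Factorized noise: one noise vector per input, one per output
        eps_in = f(self.rng.standard_normal(self.in_features))
        eps_out = f(self.rng.standard_normal(self.out_features))
        w = self.w_mu + self.w_sigma * np.outer(eps_out, eps_in)
        b = self.b_mu + self.b_sigma * eps_out
        return x @ w.T + b
```

Factorized noise needs only `in_features + out_features` Gaussian samples per forward pass instead of `in_features * out_features`, which is why it is the variant typically used in DQN agents.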
Quantile Regression DQN implementation for bridge fleet maintenance optimization using Markov Decision Process. Migrated from C51 distributional RL (v0.8) with 200 quantiles and Huber loss. Features: Dueling architecture, Noisy Networks, PER, N-step learning. All 6 maintenance actions show positive returns with 68-78% VaR improvement.
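Quantile Regression DQN (Dabney et al., 2018) trains each predicted quantile against target samples with an asymmetric Huber loss, which is what the "200 quantiles and Huber loss" above refers to. As a rough illustration (a NumPy sketch under my own naming, not the repo's implementation):

```python
import numpy as np

def quantile_huber_loss(pred_quantiles, target_samples, kappa=1.0):
    """Quantile Huber loss from QR-DQN.

    pred_quantiles: (N,) predicted quantile values theta_i
    target_samples: (M,) samples from the target return distribution
    kappa: Huber threshold
    """
    n = len(pred_quantiles)
    # Quantile midpoints tau_hat_i = (2i + 1) / (2N)
    tau = (np.arange(n) + 0.5) / n
    # Pairwise TD errors u_ij = target_j - pred_i
    u = target_samples[None, :] - pred_quantiles[:, None]          # (N, M)
    huber = np.where(np.abs(u) <= kappa,
                     0.5 * u ** 2,
                     kappa * (np.abs(u) - 0.5 * kappa))
    # Asymmetric weight |tau_i - 1{u < 0}| makes the regression quantile-seeking
    weight = np.abs(tau[:, None] - (u < 0).astype(float))
    return (weight * huber).mean(axis=1).sum()
```

The asymmetric weight is what pushes each `theta_i` toward the i-th quantile of the target distribution rather than its mean, which is also what makes risk measures like VaR readable directly off the learned quantiles.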
Deep-reinforcement-learning-based multi-lane autonomous driving considering obstacles and traffic lights (Journal of the Korean Institute of Communications and Information Sciences, June 2024 issue)
Markov Decision Process DQN with Noisy Networks for Exploration (ICLR 2018) - 21.1% performance improvement over ε-greedy.
Slide presentation reviewing advances in reinforcement learning
C51 Distributional DQN (v0.8) for bridge fleet maintenance optimization. Implements categorical return distributions (Bellemare et al., PMLR 2017) with 300x speedup via vectorized projection. Combines Noisy Networks, Dueling DQN, Double DQN, PER, and n-step learning. Validated on 200-bridge fleet: +3,173 reward in 83 min (25k episodes).
Example Noisy DQN implementation with ReLAx
Deep reinforcement learning implementations: 1) DQN, 2) Double DQN, 3) Dueling DQN, 4) Noisy Net (Noisy DQN), 5) DQN with Prioritized Experience Replay, 6) Noisy Double DQN with Prioritized Experience Replay, 7) Noisy Dueling Double DQN with Prioritized Experience Replay
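Two of the combinable DQN extensions recurring in these repos are small, self-contained formulas: the dueling aggregation Q(s, a) = V(s) + A(s, a) − mean_a′ A(s, a′) (Wang et al., 2016) and the Double DQN target, where the online network selects the next action and the target network evaluates it (van Hasselt et al., 2016). A NumPy sketch of both (illustrative only; function names are my own):

```python
import numpy as np

def dueling_q(value, advantages):
    # Dueling DQN aggregation: subtracting the mean advantage keeps
    # V and A identifiable: Q(s,a) = V(s) + A(s,a) - mean_a' A(s,a')
    return value + advantages - advantages.mean(axis=-1, keepdims=True)

def double_dqn_target(reward, done, q_online_next, q_target_next, gamma=0.99):
    # Double DQN: the online net picks argmax_a Q_online(s', a),
    # the target net supplies the value of that action.
    a_star = np.argmax(q_online_next, axis=-1)
    q_eval = np.take_along_axis(q_target_next, a_star[..., None], axis=-1).squeeze(-1)
    return reward + gamma * (1.0 - done) * q_eval
```

Because the two tricks touch different parts of the agent (the network head vs. the bootstrap target), they compose freely with Noisy Nets, PER, and n-step returns, which is exactly the combinatorial menu the list above enumerates.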