Seeing as the authors claim that KANs can reduce the catastrophic forgetting we see in MLPs, I thought, "Wouldn't it be nice if there were an LLM that substituted KANs for MLPs?" I looked around, didn't find one, so I built one!<p>- PyTorch module (kan_gpt)<p>- Deployed to PyPI<p>- MIT License<p>- Test cases to ensure the forward and backward passes work as expected<p>- Training script<p>I am currently training it on the WebText dataset to compare it against the original GPT-2, but I'm running into a few out-of-memory issues. Perhaps the vocab size (50,257) is too large?<p>I'm open to contributions and would love to hear your thoughts!
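<p>For anyone curious what "substituting KANs for MLPs" looks like in a transformer block, here's a minimal sketch. This is NOT the actual kan_gpt implementation — `NaiveKANLayer` and `KANBlockFFN` are names I made up for illustration, and it uses fixed Gaussian bumps rather than the learnable B-splines of the KAN paper. The point is the shape of the idea: every edge carries a small learnable 1-D function instead of a single scalar weight, and the two-layer FFN of a GPT block is replaced by two such layers.

```python
import torch
import torch.nn as nn


class NaiveKANLayer(nn.Module):
    """Simplified KAN-style layer (illustrative, not the kan_gpt code).

    Each (input, output) edge gets its own learnable 1-D function,
    parameterized as a linear combination of fixed Gaussian basis
    functions; outputs are the sums over incoming edge-functions.
    """

    def __init__(self, in_features: int, out_features: int, num_basis: int = 8):
        super().__init__()
        # One coefficient per (output, input, basis) triple -- note this is
        # num_basis times the parameter count of a plain nn.Linear, which is
        # one reason KAN models are memory-hungry.
        self.coef = nn.Parameter(
            torch.randn(out_features, in_features, num_basis) * 0.1
        )
        # Fixed basis-function centers spread over [-1, 1].
        self.register_buffer("centers", torch.linspace(-1.0, 1.0, num_basis))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., in_features) -> phi: (..., in_features, num_basis)
        phi = torch.exp(-((x.unsqueeze(-1) - self.centers) ** 2) / 0.1)
        # Sum over input features and basis functions per output unit.
        return torch.einsum("...ib,oib->...o", phi, self.coef)


class KANBlockFFN(nn.Module):
    """Drop-in stand-in for the MLP (fc -> act -> proj) in a GPT block."""

    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.up = NaiveKANLayer(d_model, d_hidden)
        self.down = NaiveKANLayer(d_hidden, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The per-edge functions supply the nonlinearity, so no separate
        # activation between the two layers.
        return self.down(self.up(x))
```

Because each edge stores `num_basis` coefficients instead of one weight, the FFN's parameter (and activation) footprint grows by roughly that factor, which may be part of why the out-of-memory issues show up at GPT-2 scale with a 50,257-token vocabulary.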