TechEcho


Show HN: Autocomplete Python Code with Transformers

3 points by vpj over 4 years ago
This is a small project we created to train a character-level autoregressive transformer (or LSTM) model to predict Python source code. We trained it on GitHub repositories found on the awesome-pytorch list.

GitHub repo: https://github.com/lab-ml/python_autocomplete

You can try training on Google Colab: https://colab.research.google.com/github/lab-ml/python_autocomplete/blob/master/notebooks/train.ipynb

Here are some sample evaluations/visualizations of the trained model: https://colab.research.google.com/github/lab-ml/python_autocomplete/blob/master/notebooks/evaluate.ipynb

Working on a simple VSCode extension to test this out. Will open-source it soon in the same repository.
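The core idea, predicting the next source character from the characters before it, can be illustrated without any neural network. The toy sketch below is not the project's model (that is a transformer/LSTM trained on GitHub code); it is a character-level n-gram stand-in, with an invented corpus, that completes a prefix the same autoregressive way:

```python
from collections import Counter, defaultdict

def train_char_ngrams(corpus: str, context: int = 4) -> dict:
    """Count which character follows each `context`-length window."""
    model = defaultdict(Counter)
    for i in range(len(corpus) - context):
        model[corpus[i:i + context]][corpus[i + context]] += 1
    return model

def complete(model: dict, prefix: str, max_chars: int = 20,
             context: int = 4) -> str:
    """Greedily append the most likely next character, autoregressively."""
    out = prefix
    for _ in range(max_chars):
        window = out[-context:]
        if window not in model:
            break
        out += model[window].most_common(1)[0][0]
    return out

# Tiny invented "training set" of Python-ish source.
corpus = "def foo():\n    return 1\n\ndef bar():\n    return 2\n"
model = train_char_ngrams(corpus)
print(repr(complete(model, "def ")))
# → 'def foo():\n    return 1\n'
```

A real model replaces the lookup table with a network that generalizes to unseen windows, but the sampling loop is the same shape.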

2 comments

vpj over 4 years ago
Links:

GitHub repo: https://github.com/lab-ml/python_autocomplete

Training notebook: https://colab.research.google.com/github/lab-ml/python_autocomplete/blob/master/notebooks/train.ipynb

Evaluation notebook: https://colab.research.google.com/github/lab-ml/python_autocomplete/blob/master/notebooks/evaluate.ipynb
alexanderrofail over 4 years ago
https://www.kite.com/ does this too, claiming 47% fewer keystrokes. They set a good benchmark for seeing how efficient the model can get.
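A "fewer keystrokes" figure like the one cited above can be made precise: simulate typing the file, and whenever the completer's next-character prediction matches what the user was about to type, accept it instead of typing. The sketch below is a hypothetical metric, not Kite's actual methodology; the `predict` callback is an invented interface that any character-level completer could implement:

```python
def keystrokes_saved(text: str, predict) -> float:
    """Fraction of characters the user did not have to type.

    `predict(prefix)` returns the model's guess for the next character
    (a hypothetical interface; any char-level completer fits it).
    """
    saved = 0
    for i in range(1, len(text)):
        if predict(text[:i]) == text[i]:
            saved += 1  # accept the suggestion instead of typing it
    return saved / max(len(text) - 1, 1)

# A trivial "model" that always predicts a space saves exactly the
# keystrokes that are spaces (3 of the 11 typed after the first char).
rate = keystrokes_saved("return a + b", lambda prefix: " ")
print(f"{rate:.0%}")
# → 27%
```

Accepting multi-character suggestions would raise the savings further, which is why greedy single-character prediction is a lower bound on this metric.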