How to use neulab/codebert-javascript with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="neulab/codebert-javascript")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("neulab/codebert-javascript")
model = AutoModelForMaskedLM.from_pretrained("neulab/codebert-javascript")
```
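As a quick sanity check, the fill-mask pipeline can be run on a masked JavaScript snippet. A minimal sketch (the example snippet is illustrative, not from the model card; CodeBERT is RoBERTa-based, so the mask token is `<mask>`):

```python
from transformers import pipeline

# Load the fill-mask pipeline with this model (downloads weights on first run).
pipe = pipeline("fill-mask", model="neulab/codebert-javascript")

# Masked JavaScript snippet; "<mask>" marks the token to predict.
masked_code = "function add(a, b) { return a <mask> b; }"

# Each prediction is a dict with "token_str", "score", and "sequence" keys.
for p in pipe(masked_code):
    print(f'{p["token_str"]!r}: {p["score"]:.3f}')
```

The pipeline returns candidates ranked by score, so the first entry is the model's best guess for the masked token.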
This is a `microsoft/codebert-base-mlm` model, trained for 1,000,000 steps (with `batch_size=32`) on **JavaScript** code from the `codeparrot/github-code-clean` dataset, on the masked-language-modeling task.
It is intended to be used in CodeBERTScore ([https://github.com/neulab/code-bert-score](https://github.com/neulab/code-bert-score)), but it can also be used for any other task that takes a pretrained masked language model of code.
For more information, see: [https://github.com/neulab/code-bert-score](https://github.com/neulab/code-bert-score)