NB-BERT
Models based on BERT from Google, and trained on data from various sources, including the digital collection at the National Library of Norway.
How to use NbAiLab/nb-bert-base with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="NbAiLab/nb-bert-base")

# Or load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("NbAiLab/nb-bert-base", dtype="auto")
```

NB-BERT-base is a general BERT-base model built on the large digital collection at the National Library of Norway.
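As an illustrative sketch (not from the model card), the fill-mask pipeline returns a list of candidate completions for the `[MASK]` token, each a dict with fields such as `token_str` and `score`. The snippet below mimics that output shape with made-up values and selects the top-scoring candidate, so it runs without downloading the model; the example tokens are hypothetical.

```python
# Hypothetical fill-mask output: a list of candidate dicts,
# shaped like what transformers' fill-mask pipeline returns.
# The token strings and scores below are invented for illustration.
predictions = [
    {"token_str": "boken", "score": 0.42},
    {"token_str": "avisen", "score": 0.21},
    {"token_str": "brevet", "score": 0.09},
]

def top_prediction(preds):
    """Return the token string with the highest score."""
    return max(preds, key=lambda p: p["score"])["token_str"]

print(top_prediction(predictions))  # → boken
```

With the real pipeline, `pipe("Jeg leste [MASK] i går.")` would produce a list in this format, which can be post-processed the same way.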
This model has the same architecture as the multilingual cased BERT model, and is trained on a wide variety of Norwegian text (both Bokmål and Nynorsk) from the last 200 years.
Version 1.1 of the model is a general model and should be fine-tuned for any particular downstream task. Some fine-tuning datasets may be found on GitHub, see
The model is trained on a wide variety of text. The training corpus is described on
For more information on the model, see