These are the Juanako 7B models, trained with SFT, DDP & UNA
FBL (fblgit)
AI & ML interests
None yet
Recent Activity
posted an update about 1 hour ago
Introducing `HarEmb-PII`, a single-transformer-block layer distilled from the OpenMed PII privacy filter.
It's a very tiny model that reaches comparable results on PII classification through Viterbi BIOES decoding, retaining ~98% of the original model's performance while being a tiny fraction of its size.
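The BIOES decoding mentioned above can be sketched as a constrained Viterbi pass: per-token emission scores are combined with a transition mask that forbids invalid tag sequences (e.g. an `I-` or `E-` tag without a preceding `B-`). This is a minimal illustrative sketch assuming a single entity type and log-score emissions; the tag set and function are hypothetical, not the model's actual API:

```python
import math

# Hypothetical tag set with one entity type ("PII"); a real PII model
# uses several entity types, but the decoding logic is identical.
TAGS = ["O", "B-PII", "I-PII", "E-PII", "S-PII"]

def allowed(prev: str, cur: str) -> bool:
    """BIOES transition constraints for a single entity type."""
    if prev in ("O", "E-PII", "S-PII"):        # we are outside an entity
        return cur in ("O", "B-PII", "S-PII")  # may only start a fresh one
    # prev is B-PII or I-PII: the open entity must continue or close
    return cur in ("I-PII", "E-PII")

def viterbi_bioes(emissions):
    """emissions: list of dicts {tag: log-score}, one per token.
    Returns the highest-scoring tag sequence that respects BIOES."""
    # score[t] = best log-score of any valid path ending in tag t;
    # the first token behaves as if preceded by "O" (outside).
    score = {t: (emissions[0][t] if allowed("O", t) else -math.inf)
             for t in TAGS}
    back = []
    for em in emissions[1:]:
        new_score, ptr = {}, {}
        for cur in TAGS:
            best_prev = max(
                (t for t in TAGS if allowed(t, cur)),
                key=lambda t: score[t],
            )
            new_score[cur] = score[best_prev] + em[cur]
            ptr[cur] = best_prev
        score, back = new_score, back + [ptr]
    # backtrack from the best *closed* final tag (no dangling B-/I-)
    tag = max(("O", "E-PII", "S-PII"), key=lambda t: score[t])
    path = [tag]
    for ptr in reversed(back):
        tag = ptr[tag]
        path.append(tag)
    return list(reversed(path))
```

Even when the per-token argmax would emit an invalid sequence like `O, I-PII, O`, the transition mask forces the decoder onto the best legal path (e.g. `O, B-PII, E-PII`), which is how a small distilled model can recover clean entity spans from noisy token scores.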
It doubles throughput (tokens/s) and dramatically reduces both the active parameter count and the VRAM footprint.
The evaluation & benchmarking are in the model repository and can be reproduced. I trained it on an RTX 4090 without issues; it is compatible with the OpenMed suite and is an in-place replacement for the OpenMed privacy-filter model.
https://huggingface.co/fblgit/haremb-privacy-filter-opennemo
I'm looking for people who want to co-author, contribute to, or endorse the HarEmb research and the model's technical paper.
Contact xavi@juanako.ai
liked a model 2 days ago
OpenMed/privacy-filter-multilingual
liked a model 3 days ago
fblgit/haremb-privacy-filter-opennemo