Tag: HF Multimodal
facebook/dragon-plus-query-encoder
DRAGON+ is a BERT-base sized dense retriever initialized from RetroMAE and further trained on data augmented from the MS MARCO corpus, following th...
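A minimal sketch of the dual-encoder usage, following the pattern on the model card: the paired facebook/dragon-plus-context-encoder and [CLS] pooling; the sample query and passages are illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/dragon-plus-query-encoder")
query_encoder = AutoModel.from_pretrained("facebook/dragon-plus-query-encoder")
context_encoder = AutoModel.from_pretrained("facebook/dragon-plus-context-encoder")

query = "where was marie curie born"
contexts = [
    "Marie Curie was born in Warsaw in what was then the Kingdom of Poland.",
    "Maria Sklodowska later moved to Paris to continue her studies.",
]
q = tokenizer(query, return_tensors="pt")
c = tokenizer(contexts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # DRAGON+ uses the [CLS] token embedding as the dense representation.
    q_emb = query_encoder(**q).last_hidden_state[:, 0, :]
    c_emb = context_encoder(**c).last_hidden_state[:, 0, :]
scores = q_emb @ c_emb.T  # dot-product relevance scores, shape (1, 2)
```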
DeepPavlov/bert-base-cased-conversational
bert-base-cased-conversational Conversational BERT (English, cased, 12‑layer, 768‑hidden, 12‑heads, 110M parameters) was trained on the English pa...
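A minimal sketch, assuming feature-extraction use of the checkpoint as a plain encoder (the same loading pattern applies to the Russian DeepPavlov/rubert-base-cased-conversational entry below); the sample utterance is illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/bert-base-cased-conversational")
model = AutoModel.from_pretrained("DeepPavlov/bert-base-cased-conversational")

# Token-level contextual features for a dialogue utterance; task-specific
# heads (intent, dialogue act, etc.) would be fine-tuned on top.
inputs = tokenizer("hey, how's it going?", return_tensors="pt")
with torch.no_grad():
    features = model(**inputs).last_hidden_state  # (1, seq_len, 768)
```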
microsoft/bloom-deepspeed-inference-fp16
This is a copy of the original BLOOM weights that is more efficient to use with DeepSpeed-MII and DeepSpeed-Inference. In this repo the origina...
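A heavily hedged launch sketch modeled on the public bloom-ds-inference example script; the checkpoints.json filename, the 8-GPU tensor-parallel degree, and taking the tokenizer from bigscience/bloom are assumptions, not guarantees about this repo's layout.

```python
# Sketch only: assumes an 8-GPU node launched with `deepspeed --num_gpus 8 run.py`.
import torch
import deepspeed
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

name = "microsoft/bloom-deepspeed-inference-fp16"
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom")  # original BLOOM tokenizer
config = AutoConfig.from_pretrained(name)

# Build the model on the meta device so the 176B parameters are never
# materialized in CPU RAM; DeepSpeed streams the pre-sharded fp16 weights
# straight to the GPUs instead.
with deepspeed.OnDevice(dtype=torch.float16, device="meta"):
    model = AutoModelForCausalLM.from_config(config, torch_dtype=torch.float16)

model = deepspeed.init_inference(
    model,
    mp_size=8,                        # tensor-parallel degree; must match the launch
    dtype=torch.float16,
    checkpoint="checkpoints.json",    # assumption: an index file listing the shards
    replace_with_kernel_inject=True,  # swap in DeepSpeed's fused inference kernels
).module

inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(torch.cuda.current_device())
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```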
princeton-nlp/sup-simcse-roberta-large
Model Card for sup-simcse-roberta-large: a supervised SimCSE sentence-embedding model built on RoBERTa-large. Developed by: Princeton-NLP.
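A minimal usage sketch following the pattern in the SimCSE repository (pooler output as the sentence embedding); the sentence pair is illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/sup-simcse-roberta-large")
model = AutoModel.from_pretrained("princeton-nlp/sup-simcse-roberta-large")

texts = ["A man is playing a guitar.", "A man plays an instrument."]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # The SimCSE repo takes the pooler output ([CLS] + MLP) as the embedding.
    emb = model(**inputs).pooler_output
print(float(torch.cosine_similarity(emb[0], emb[1], dim=0)))
```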
skt/kobert-base-v1
Please refer to https://github.com/SKTBrain/KoBERT for details.
deepset/gbert-base-germandpr-question_encoder
Overview: Language model: gbert-base-germandpr · Language: German · Training data: GermanDPR train set (~56 MB) · Eval data: GermanDPR test set (~6 MB) · Infr...
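A sketch under the assumption that the checkpoint loads with transformers' DPR classes; deepset's own examples typically pair it with the matching context encoder inside the Haystack framework, and the German question here is illustrative.

```python
import torch
from transformers import AutoTokenizer, DPRQuestionEncoder

# Assumption: the checkpoint is stored in transformers' DPR format.
name = "deepset/gbert-base-germandpr-question_encoder"
tokenizer = AutoTokenizer.from_pretrained(name)
encoder = DPRQuestionEncoder.from_pretrained(name)

inputs = tokenizer("Wie heißt die Hauptstadt von Deutschland?", return_tensors="pt")
with torch.no_grad():
    q_emb = encoder(**inputs).pooler_output  # 768-d dense question vector
```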
intfloat/simlm-base-msmarco-finetuned
SimLM: Pre-training with Representation Bottleneck for Dense Passage Retrieval. Paper available at https://arxiv.org/pdf/2207.02578; code available a...
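A scoring sketch assuming [CLS] pooling with L2 normalization, the convention in the SimLM codebase; the query and passage are illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "intfloat/simlm-base-msmarco-finetuned"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

query = "how long is a passport valid"
passage = "A passport is typically valid for ten years from the date of issue."
inputs = tokenizer([query, passage], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # Assumption: [CLS] pooling + L2 norm, as in the SimLM codebase.
    emb = model(**inputs).last_hidden_state[:, 0]
    emb = torch.nn.functional.normalize(emb, dim=-1)
score = emb[0] @ emb[1]  # relevance as cosine similarity
```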
indobenchmark/indobert-base-p2
IndoBERT Base Model (phase2 – uncased) IndoBERT is a state-of-the-art language model for Indonesian based on the BERT model. The pretrained model ...
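A minimal sketch of loading the checkpoint for feature extraction; the Indonesian sentence is illustrative.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("indobenchmark/indobert-base-p2")
model = AutoModel.from_pretrained("indobenchmark/indobert-base-p2")

# Contextual features for an Indonesian sentence.
inputs = tokenizer("aku suka membaca buku", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
```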
IDEA-CCNL/Erlangshen-SimCSE-110M-Chinese
Erlangshen-SimCSE-110M-Chinese. Github: Fengshenbang-LM · Docs: Fengshenbang-Docs. Brief Introduction: based on the unsupervised SimCSE recipe, trained on collected and curated...
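A sketch following the Fengshenbang card's pattern, keeping the masked-LM head and taking the last-layer [CLS] hidden state as the sentence vector; this pooling choice and the Chinese sentence pair are assumptions for illustration.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

name = "IDEA-CCNL/Erlangshen-SimCSE-110M-Chinese"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name)

def embed(text):
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    return out.hidden_states[-1][:, 0, :].squeeze(0)  # [CLS] of the last layer

a, b = embed("今天天气真好"), embed("今天是个晴天")
print(float(torch.cosine_similarity(a, b, dim=0)))
```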
DeepPavlov/rubert-base-cased-conversational
rubert-base-cased-conversational Conversational RuBERT (Russian, cased, 12‑layer, 768‑hidden, 12‑heads, 180M parameters) was trained on OpenSubtit...
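A sketch of one plausible downstream use, masked mean pooling of utterance vectors for ranking candidate replies; the pooling scheme and the Russian utterances are illustrative assumptions, not the model card's prescribed recipe.

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "DeepPavlov/rubert-base-cased-conversational"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

utterances = ["привет, как дела?", "здравствуйте, как ваши дела?", "сколько стоит билет?"]
inputs = tokenizer(utterances, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state
mask = inputs["attention_mask"].unsqueeze(-1)
emb = (hidden * mask).sum(1) / mask.sum(1)  # masked mean pooling
emb = torch.nn.functional.normalize(emb, dim=-1)
print(emb @ emb.T)  # pairwise similarity of the three utterances
```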