whis-22/bankai-summary

This is a fine-tuned version of the T5 model for text summarization.

Model Details

  • Model type: T5
  • Language(s) (NLP): English
  • License: MIT
  • Fine-tuned from: t5-base

Intended Uses & Limitations

This model is intended for summarizing text content. It was fine-tuned on a custom summarization dataset.

How to Use

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForSeq2SeqLM.from_pretrained("whis-22/bankai-summary")
tokenizer = AutoTokenizer.from_pretrained("whis-22/bankai-summary")

def summarize(text, max_length=142):
    # T5 expects a task prefix; inputs longer than 512 tokens are truncated
    inputs = tokenizer("summarize: " + text, return_tensors="pt", max_length=512, truncation=True)
    # Beam search with a length penalty favors complete, fluent summaries
    outputs = model.generate(
        inputs["input_ids"],
        max_length=max_length,
        min_length=30,
        length_penalty=2.0,
        num_beams=4,
        early_stopping=True
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example usage
summary = summarize("Your long text here...")
print(summary)
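
Alternatively, the same checkpoint can be loaded through the Transformers summarization pipeline, which handles tokenization and decoding internally. The sketch below is a minimal example; the generation settings mirror the snippet above and are illustrative rather than values baked into the model.

from transformers import pipeline

# High-level pipeline wrapping the same model and tokenizer
summarizer = pipeline("summarization", model="whis-22/bankai-summary")

result = summarizer(
    "Your long text here...",
    max_length=142,  # illustrative settings, matching the example above
    min_length=30,
    num_beams=4,
)
print(result[0]["summary_text"])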