Mirror of https://github.com/huggingface/candle.git, synced 2025-06-18 11:37:11 +00:00
Documentation Pass for Models (#2617)
* links in chinese_clip
* links for clip model
* add mod docs for flux and llava
* module doc for MMDIT and MIMI
* add docs for a few more models
* mod docs for bert naser and beit
* add module docs for convmixer colpali codegeex and chatglm
* add another series of moddocs
* add fastvit-llama2_c
* module docs mamba -> mobileone
* module docs from moondream-phi3
* mod docs for quantized and qwen
* update to yi
* fix long names
* Update llama2_c.rs
* Update llama2_c_weights.rs
* Fix the link for mimi + tweaks

---------

Co-authored-by: Laurent Mazare <laurent.mazare@gmail.com>
@@ -1,5 +1,19 @@
-// T5 Text Model
-// https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py
+//! T5 model implementation.
+//!
+//! T5 (Text-to-Text Transfer Transformer) is a unified text-to-text transformer model.
+//! This implementation follows the original model architecture.
+//!
+//! Key characteristics:
+//! - Text-to-text framework
+//! - Relative positional embeddings
+//! - T5-specific layer normalization
+//! - Encoder-decoder architecture
+//! - Support for sequence-to-sequence tasks
+//!
+//! References:
+//! - [T5 Paper](https://arxiv.org/abs/1910.10683)
+//! - [HuggingFace T5](https://huggingface.co/docs/transformers/model_doc/t5)
+//! - [GH Model](https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py)
 
 use crate::models::with_tracing::Embedding;
 use candle::{DType, Device, Module, Result, Tensor, D};
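The "relative positional embeddings" bullet in the new module docs refers to T5's bucketed relative attention bias, where nearby token offsets get their own bucket and larger offsets share logarithmically spaced buckets. A self-contained scalar sketch of that bucketing scheme (bidirectional case, following the reference transformers implementation; the function name and scalar form here are illustrative, not candle's actual API):

```rust
/// Map a relative position (key_pos - query_pos) to a bucket index,
/// mirroring T5's `_relative_position_bucket` for the bidirectional case.
/// Half the buckets encode sign; small offsets map exactly, large offsets
/// are binned logarithmically up to `max_distance`.
fn relative_position_bucket(relative_position: i64, num_buckets: i64, max_distance: i64) -> i64 {
    // Bidirectional: split buckets between negative and positive offsets.
    let num_buckets = num_buckets / 2;
    let mut bucket = if relative_position > 0 { num_buckets } else { 0 };
    let rel = relative_position.abs();

    // Offsets below max_exact get one bucket each.
    let max_exact = num_buckets / 2;
    if rel < max_exact {
        bucket += rel;
    } else {
        // Larger offsets share log-spaced buckets, capped at the last bucket.
        let log_ratio = ((rel as f64) / (max_exact as f64)).ln()
            / ((max_distance as f64) / (max_exact as f64)).ln();
        let large = max_exact + (log_ratio * ((num_buckets - max_exact) as f64)) as i64;
        bucket += large.min(num_buckets - 1);
    }
    bucket
}

fn main() {
    // With T5's defaults (32 buckets, max_distance 128), nearby offsets
    // get distinct buckets while distant ones collapse together.
    for &d in &[-64i64, -1, 0, 1, 8, 64] {
        println!("offset {d:>4} -> bucket {}", relative_position_bucket(d, 32, 128));
    }
}
```

In the actual model this function is applied over a grid of query/key offsets to index a learned bias embedding that is added to the attention logits, which is why T5 needs no absolute position embeddings.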