hezar.models.backbone.roberta.roberta module¶
RoBERTa base language model (HuggingFace Transformers) wrapped by a Hezar Model class
- class hezar.models.backbone.roberta.roberta.RoBERTa(config, **kwargs)[source]¶

  Bases: Model

  - forward(token_ids, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, encoder_hidden_states=None, encoder_attention_mask=None, past_key_values=None, use_cache=None, output_attentions=None, output_hidden_states=None, **kwargs)[source]¶

    Forward the inputs through the model and return the raw outputs.

    Parameters:
    - token_ids – Input token ids; the only required argument. The remaining arguments are the standard optional Transformers forward inputs.

    Returns:
    - A dict of outputs such as logits, loss, etc.

  - post_process(model_outputs, **kwargs)[source]¶

    Process model outputs and return human-readable results. Called inside self.predict().

    Parameters:
    - model_outputs – Model outputs to process
    - **kwargs – Extra arguments specific to the derived class

    Returns:
    - Model outputs processed into human-readable results

  - preprocess(inputs: str | List[str], **kwargs)[source]¶

    Preprocess raw inputs and prepare them for the model’s forward().

    Parameters:
    - inputs – Raw model inputs (a string or a list of strings)
    - **kwargs – Extra kwargs specific to the model; see the model’s class for more info

    Returns:
    - A dict of inputs for the model forward

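Taken together, predict() chains these three methods: preprocess builds the forward inputs, forward produces the raw outputs, and post_process converts them into readable results. A minimal stdlib-only sketch of that flow (the toy tokenizer, fake logits, and class below are illustrative stand-ins, not Hezar's actual implementation):

```python
from typing import Dict, List, Union


class ToyRoBERTa:
    """Illustrative stand-in mimicking the preprocess -> forward -> post_process flow."""

    vocab = {"<unk>": 0, "hello": 1, "world": 2}

    def preprocess(self, inputs: Union[str, List[str]], **kwargs) -> Dict:
        # Normalize a single string to a batch of one, then tokenize.
        texts = [inputs] if isinstance(inputs, str) else inputs
        token_ids = [
            [self.vocab.get(tok, 0) for tok in text.lower().split()]
            for text in texts
        ]
        return {"token_ids": token_ids}

    def forward(self, token_ids: List[List[int]], **kwargs) -> Dict:
        # Fake "logits": one score per token (a real model returns tensors).
        logits = [[float(t) for t in seq] for seq in token_ids]
        return {"logits": logits}

    def post_process(self, model_outputs: Dict, **kwargs) -> List[Dict]:
        # Reduce raw outputs to one human-readable record per input.
        return [{"score": sum(seq) / len(seq)} for seq in model_outputs["logits"]]

    def predict(self, inputs: Union[str, List[str]], **kwargs) -> List[Dict]:
        model_inputs = self.preprocess(inputs, **kwargs)
        model_outputs = self.forward(**model_inputs)
        return self.post_process(model_outputs, **kwargs)


model = ToyRoBERTa()
print(model.predict("hello world"))  # one record per input string
```

Note how forward only ever sees the dict that preprocess returned; this is why preprocess must emit keys matching forward's parameter names (token_ids, attention_mask, etc.).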
 - skip_keys_on_load = ['model.embeddings.position_ids', 'roberta.embeddings.position_ids']¶
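skip_keys_on_load names state-dict entries to drop rather than match when pretrained weights are loaded; position_ids buffers are a common case because some transformers versions persist them in checkpoints and others do not. A sketch of that filtering over a plain dict state dict (the helper function and fake checkpoint are hypothetical, not Hezar's API):

```python
skip_keys_on_load = ["model.embeddings.position_ids", "roberta.embeddings.position_ids"]


def filter_state_dict(state_dict: dict, skip_keys: list) -> dict:
    """Drop keys the model intentionally skips, so loading won't fail on
    buffers absent from the current architecture (hypothetical helper)."""
    return {k: v for k, v in state_dict.items() if k not in skip_keys}


# A fake checkpoint containing an obsolete position_ids buffer:
checkpoint = {
    "roberta.embeddings.position_ids": [0, 1, 2],
    "roberta.embeddings.word_embeddings.weight": [[0.1, 0.2]],
}
filtered = filter_state_dict(checkpoint, skip_keys_on_load)
print(sorted(filtered))  # the skipped buffer is gone, the weights remain
```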