Roberta Classifiers

This module contains code to build text classification models using Roberta-related models

Classification head


source

ConcatHeadExtended

 ConcatHeadExtended (config, classifier_dropout=0.1, last_hidden_size=768,
                     layer2concat=4, num_labels=None, **kwargs)

Concatenated head for Roberta Classification Model. This head takes the last n hidden states of [CLS] and concatenates them before passing the result through the classifier head

|  | Type | Default | Details |
|---|---|---|---|
| config |  |  | HuggingFace model configuration |
| classifier_dropout | float | 0.1 | Dropout ratio (for the dropout layer right before the last nn.Linear) |
| last_hidden_size | int | 768 | Last hidden size (before the last nn.Linear) |
| layer2concat | int | 4 | Number of hidden layers to concatenate (counting from the top) |
| num_labels | NoneType | None | Number of output labels; overrides config.num_labels |
| kwargs |  |  |  |
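The internals of `ConcatHeadExtended` are not shown above; as an illustration only, a minimal PyTorch sketch of this kind of extended concat head (the exact layer order and activation in the real class may differ) could look like:

```python
import torch
from torch import nn

# Illustrative sketch only -- not the actual ConcatHeadExtended implementation.
# Concatenate the [CLS] vector from the top `layer2concat` hidden states,
# project to `last_hidden_size`, then apply dropout and the final classifier.
class ConcatHeadExtendedSketch(nn.Module):
    def __init__(self, hidden_size=768, classifier_dropout=0.1,
                 last_hidden_size=768, layer2concat=4, num_labels=2):
        super().__init__()
        self.layer2concat = layer2concat
        self.dense = nn.Linear(hidden_size * layer2concat, last_hidden_size)
        self.dropout = nn.Dropout(classifier_dropout)
        self.out_proj = nn.Linear(last_hidden_size, num_labels)

    def forward(self, hidden_states):
        # hidden_states: tuple of (batch, seq_len, hidden_size) tensors,
        # as returned by a Roberta model with output_hidden_states=True
        cls_vecs = [h[:, 0] for h in hidden_states[-self.layer2concat:]]
        x = torch.tanh(self.dense(torch.cat(cls_vecs, dim=-1)))
        return self.out_proj(self.dropout(x))

states = tuple(torch.randn(2, 16, 768) for _ in range(13))  # embeddings + 12 layers
logits = ConcatHeadExtendedSketch(num_labels=3)(states)
print(logits.shape)  # torch.Size([2, 3])
```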

source

ConcatHeadSimple

 ConcatHeadSimple (config, classifier_dropout=0.1, layer2concat=4,
                   num_labels=None, **kwargs)

Concatenated head for Roberta Classification Model, the simpler version (no hidden linear layer). This head takes the last n hidden states of [CLS] and concatenates them before passing the result through the classifier head

|  | Type | Default | Details |
|---|---|---|---|
| config |  |  | HuggingFace model configuration |
| classifier_dropout | float | 0.1 | Dropout ratio (for the dropout layer right before the last nn.Linear) |
| layer2concat | int | 4 | Number of hidden layers to concatenate (counting from the top) |
| num_labels | NoneType | None | Number of output labels; overrides config.num_labels |
| kwargs |  |  |  |
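The simpler version drops the hidden linear layer: the concatenated [CLS] states go straight through dropout into the final classifier. A PyTorch sketch under that assumption (illustration only, not the actual `ConcatHeadSimple` code):

```python
import torch
from torch import nn

# Illustrative sketch only -- not the actual ConcatHeadSimple implementation.
# Concatenated [CLS] states -> dropout -> single linear classifier.
class ConcatHeadSimpleSketch(nn.Module):
    def __init__(self, hidden_size=768, classifier_dropout=0.1,
                 layer2concat=4, num_labels=2):
        super().__init__()
        self.layer2concat = layer2concat
        self.dropout = nn.Dropout(classifier_dropout)
        self.out_proj = nn.Linear(hidden_size * layer2concat, num_labels)

    def forward(self, hidden_states):
        cls_vecs = [h[:, 0] for h in hidden_states[-self.layer2concat:]]
        return self.out_proj(self.dropout(torch.cat(cls_vecs, dim=-1)))

states = tuple(torch.randn(4, 10, 768) for _ in range(13))
logits = ConcatHeadSimpleSketch(num_labels=5)(states)
print(logits.shape)  # torch.Size([4, 5])
```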

source

RobertaClassificationHeadCustom

 RobertaClassificationHeadCustom (config, classifier_dropout=0.1,
                                  num_labels=None, **kwargs)

*Same as RobertaClassificationHead, but you can freely adjust the dropout ratio

Reference: https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/modeling_roberta.py#L1424*

|  | Type | Default | Details |
|---|---|---|---|
| config |  |  | HuggingFace model configuration |
| classifier_dropout | float | 0.1 | Dropout ratio (for the dropout layer right before the last nn.Linear) |
| num_labels | NoneType | None | Number of output labels; overrides config.num_labels |
| kwargs |  |  |  |
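For reference, the HuggingFace `RobertaClassificationHead` linked above applies dropout, a dense projection with tanh, dropout again, then the output projection. A sketch with the dropout ratio exposed as a constructor argument (illustration only, not the library's exact code):

```python
import torch
from torch import nn

# Sketch following the structure of HuggingFace's RobertaClassificationHead,
# with classifier_dropout configurable (illustration only).
class RobertaHeadSketch(nn.Module):
    def __init__(self, hidden_size=768, classifier_dropout=0.1, num_labels=2):
        super().__init__()
        self.dense = nn.Linear(hidden_size, hidden_size)
        self.dropout = nn.Dropout(classifier_dropout)
        self.out_proj = nn.Linear(hidden_size, num_labels)

    def forward(self, features):
        x = features[:, 0]             # take the <s> ([CLS]) token state
        x = self.dropout(x)
        x = torch.tanh(self.dense(x))
        x = self.dropout(x)
        return self.out_proj(x)

features = torch.randn(2, 16, 768)     # (batch, seq_len, hidden)
logits = RobertaHeadSketch(num_labels=4)(features)
print(logits.shape)  # torch.Size([2, 4])
```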

Main classification architecture


source

RobertaBaseForSequenceClassification

 RobertaBaseForSequenceClassification (config, is_multilabel=False,
                                       is_multihead=False,
                                       head_class_sizes=[],
                                       head_weights=[], head_class=None,
                                       **head_class_kwargs)

*Base Roberta Architecture for Sequence Classification task

Based on: https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/modeling_roberta.py#L1155C35-L1155C35*

|  | Type | Default | Details |
|---|---|---|---|
| config |  |  | HuggingFace model configuration |
| is_multilabel | bool | False | Whether this is a multilabel classification |
| is_multihead | bool | False | Whether this is a multihead (multi-level) classification |
| head_class_sizes | list | [] | Class size for each head |
| head_weights | list | [] | Loss weight for each head; multiplied by the loss of each head's output |
| head_class | NoneType | None | The class object of the head. You can use RobertaClassificationHeadCustom as the default |
| head_class_kwargs |  |  |  |
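The `head_weights` semantics described above can be sketched in plain PyTorch; this assumes the per-head losses are combined as a weighted sum, which is what the parameter description suggests (the library's exact combination may differ):

```python
import torch
from torch import nn

# Illustration of a weighted multi-head loss (assumed weighted sum).
head_class_sizes = [3, 5]        # two heads: a 3-class and a 5-class level
head_weights = [1.0, 0.5]        # second head's loss counts half as much

logits = [torch.randn(8, n) for n in head_class_sizes]        # per-head logits
labels = [torch.randint(0, n, (8,)) for n in head_class_sizes]

loss_fn = nn.CrossEntropyLoss()
total_loss = sum(w * loss_fn(lg, lb)
                 for w, lg, lb in zip(head_weights, logits, labels))
print(total_loss.item())
```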

source

RobertaHiddenStateConcatForSequenceClassification

 RobertaHiddenStateConcatForSequenceClassification (config,
                                                    layer2concat=4,
                                                    is_multilabel=False,
                                                    is_multihead=False,
                                                    head_class_sizes=[],
                                                    head_weights=[],
                                                    head_class=None,
                                                    **head_class_kwargs)

Roberta Architecture with Hidden-State-Concatenation for Sequence Classification task

|  | Type | Default | Details |
|---|---|---|---|
| config |  |  | HuggingFace model configuration |
| layer2concat | int | 4 | Number of hidden layers to concatenate (counting from the top) |
| is_multilabel | bool | False | Whether this is a multilabel classification |
| is_multihead | bool | False | Whether this is a multihead (multi-level) classification |
| head_class_sizes | list | [] | Class size for each head |
| head_weights | list | [] | Loss weight for each head; multiplied by the loss of each head's output |
| head_class | NoneType | None | The class object of the head. You can use ConcatHeadSimple or ConcatHeadExtended |
| head_class_kwargs |  |  |  |
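To see where the concatenated hidden states come from, the Roberta backbone can be run with `output_hidden_states=True` and the top `layer2concat` [CLS] vectors concatenated. A sketch using a tiny randomly initialized config (no weight download; illustration only, not this class's internals):

```python
import torch
from transformers import RobertaConfig, RobertaModel

# Tiny random Roberta backbone: 4 layers, hidden size 32.
config = RobertaConfig(vocab_size=100, hidden_size=32, num_hidden_layers=4,
                       num_attention_heads=4, intermediate_size=64)
model = RobertaModel(config)
input_ids = torch.randint(5, 100, (2, 8))   # (batch, seq_len)

out = model(input_ids, output_hidden_states=True)
print(len(out.hidden_states))   # 5: embedding output + 4 layers

# Concatenate the [CLS] state of the top `layer2concat` layers.
layer2concat = 2
cls_concat = torch.cat([h[:, 0] for h in out.hidden_states[-layer2concat:]],
                       dim=-1)
print(cls_concat.shape)         # torch.Size([2, 64])
```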