GPT2 Classifiers
This module contains code to build a text classification model using GPT2-related models.
Main classification architecture
GPT2BaseForSequenceClassification
GPT2BaseForSequenceClassification (config, is_multilabel=False, is_multihead=False, head_class_sizes=[], head_weights=[], head_class=None, **head_class_kwargs)
GPT2 architecture for the sequence classification task. Based on: https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/modeling_gpt2.py#L1376
| | Type | Default | Details |
|---|---|---|---|
| config | | | HuggingFace model configuration |
| is_multilabel | bool | False | Whether this is a multilabel classification |
| is_multihead | bool | False | Whether this is a multihead (multi-level) classification |
| head_class_sizes | list | [] | Class size for each head |
| head_weights | list | [] | Loss weight for each head; it is multiplied into the loss of that head's output |
| head_class | NoneType | None | The class object of the head |
| head_class_kwargs | | | Keyword arguments passed to `head_class` |
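Below is a minimal usage sketch. The import path for `GPT2BaseForSequenceClassification` is a placeholder for wherever this module lives in your package, and the head sizes and loss weights are illustrative values, not defaults.

```python
from transformers import GPT2Config

# NOTE: placeholder import path; import GPT2BaseForSequenceClassification
# from wherever this module is installed in your package.
from gpt2_classifiers import GPT2BaseForSequenceClassification

config = GPT2Config.from_pretrained("gpt2")

# Multihead (multi-level) classification: two heads with 4 and 10 classes.
model = GPT2BaseForSequenceClassification(
    config,
    is_multilabel=False,
    is_multihead=True,
    head_class_sizes=[4, 10],  # number of classes for each head (illustrative)
    head_weights=[1.0, 0.5],   # per-head loss weights, multiplied into each head's loss
)
```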