dataquality.loggers.logger_config.seq2seq package#

Submodules#

dataquality.loggers.logger_config.seq2seq.chat module#

pydantic model Seq2SeqChatLoggerConfig#

Bases: Seq2SeqLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Fields: none; all fields are inherited from Seq2SeqLoggerConfig (see below).

dataquality.loggers.logger_config.seq2seq.completion module#

pydantic model Seq2SeqCompletionLoggerConfig#

Bases: Seq2SeqLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Fields: none; all fields are inherited from Seq2SeqLoggerConfig (see below).
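
The chat and completion configs are created and validated the same way as the base config. A minimal sketch, assuming the import paths follow the module names above and that both subclasses only carry the fields inherited from Seq2SeqLoggerConfig (documented below):

from pydantic import ValidationError

from dataquality.loggers.logger_config.seq2seq.chat import Seq2SeqChatLoggerConfig
from dataquality.loggers.logger_config.seq2seq.completion import (
    Seq2SeqCompletionLoggerConfig,
)

# Every field has a default, so both configs can be created without arguments.
chat_config = Seq2SeqChatLoggerConfig()
completion_config = Seq2SeqCompletionLoggerConfig()

# Keyword arguments are validated on creation; invalid values raise ValidationError.
try:
    Seq2SeqCompletionLoggerConfig(max_input_tokens="not a number")
except ValidationError as err:
    print(err)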

dataquality.loggers.logger_config.seq2seq.seq2seq_base module#

pydantic model Seq2SeqLoggerConfig#

Bases: BaseLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Fields:
field generation_config: Optional[GenerationConfig] = None#
field generation_splits: Set[Split] = {}#
field id_to_formatted_prompt_length: Dict[str, Dict[int, int]] = {}#
field id_to_tokens: Dict[str, Dict[int, List[int]]] = {}#
field max_input_tokens: Optional[int] = None#
field max_target_tokens: Optional[int] = None#
field model: Union[PreTrainedModel, PeftModel, None] = None#
field model_type: Optional[Seq2SeqModelType] = None#
field response_template: Optional[List[int]] = None#
field sample_length: Dict[str, int] = {}#
field tokenizer: Optional[PreTrainedTokenizerFast] = None#
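
A minimal sketch of building the base config directly, assuming the import path follows the module name above; the field values are illustrative only:

from dataquality.loggers.logger_config.seq2seq.seq2seq_base import Seq2SeqLoggerConfig

# All fields default to None or an empty container, so an empty config is valid.
config = Seq2SeqLoggerConfig()

# Selected fields can be overridden as keyword arguments and are validated on creation.
config = Seq2SeqLoggerConfig(
    max_input_tokens=512,
    max_target_tokens=128,
    sample_length={"training": 1000, "validation": 200},
)

In typical use these configs are populated by the corresponding seq2seq data loggers rather than constructed by hand; heavyweight objects such as model, tokenizer, and generation_config are left as None here.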

Module contents#