dataquality.loggers.logger_config.seq2seq package#
Submodules#
dataquality.loggers.logger_config.seq2seq.chat module#
- pydantic model Seq2SeqChatLoggerConfig#
Bases: Seq2SeqLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Fields:
dataquality.loggers.logger_config.seq2seq.completion module#
- pydantic model Seq2SeqCompletionLoggerConfig#
Bases: Seq2SeqLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Fields:
dataquality.loggers.logger_config.seq2seq.seq2seq_base module#
- pydantic model Seq2SeqLoggerConfig#
Bases: BaseLoggerConfig

Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Fields:
  - generation_config (transformers.generation.configuration_utils.GenerationConfig | None)
  - model (transformers.modeling_utils.PreTrainedModel | peft.peft_model.PeftModel | None)
  - model_type (dataquality.schemas.seq2seq.Seq2SeqModelType | None)
  - tokenizer (transformers.tokenization_utils_fast.PreTrainedTokenizerFast | None)
- field generation_config: Optional[GenerationConfig] = None#
- field id_to_formatted_prompt_length: Dict[str, Dict[int, int]] = {}#
- field id_to_tokens: Dict[str, Dict[int, List[int]]] = {}#
- field max_input_tokens: Optional[int] = None#
- field max_target_tokens: Optional[int] = None#
- field model: Union[PreTrainedModel, PeftModel, None] = None#
- field model_type: Optional[Seq2SeqModelType] = None#
- field response_template: Optional[List[int]] = None#
- field sample_length: Dict[str, int] = {}#
- field tokenizer: Optional[PreTrainedTokenizerFast] = None#
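Since every field above has a default, the config can be constructed with no arguments and populated incrementally, and invalid keyword values raise pydantic's ValidationError as the docstring describes. A minimal sketch of that behavior, using a hypothetical stand-in model (`Seq2SeqLoggerConfigSketch`, not the library class) that mirrors only the plain-typed fields, so the snippet needs pydantic but not transformers or peft:

```python
from typing import Dict, List, Optional

from pydantic import BaseModel, ValidationError


class Seq2SeqLoggerConfigSketch(BaseModel):
    """Stand-in mirroring the simple documented fields of Seq2SeqLoggerConfig."""

    max_input_tokens: Optional[int] = None
    max_target_tokens: Optional[int] = None
    response_template: Optional[List[int]] = None
    id_to_tokens: Dict[str, Dict[int, List[int]]] = {}
    id_to_formatted_prompt_length: Dict[str, Dict[int, int]] = {}
    sample_length: Dict[str, int] = {}


# All fields default, so an empty call is valid; keyword args are validated.
config = Seq2SeqLoggerConfigSketch(max_input_tokens=512)
print(config.max_input_tokens)  # 512
print(config.sample_length)  # {}

# Input that cannot be coerced to the declared type raises ValidationError.
try:
    Seq2SeqLoggerConfigSketch(max_input_tokens="not-an-int")
except ValidationError:
    print("validation failed")
```

Note that pydantic copies mutable defaults like `{}` per instance, so the dict-valued fields on one config are not shared with another.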