dataclass_wizard package¶
Subpackages¶
- dataclass_wizard.environ package
- Submodules
- dataclass_wizard.environ.dumpers module
- dataclass_wizard.environ.loaders module
EnvLoader
EnvLoader.load_func_for_dataclass()
EnvLoader.load_to_byte_array()
EnvLoader.load_to_bytes()
EnvLoader.load_to_date()
EnvLoader.load_to_datetime()
EnvLoader.load_to_defaultdict()
EnvLoader.load_to_dict()
EnvLoader.load_to_iterable()
EnvLoader.load_to_named_tuple()
EnvLoader.load_to_named_tuple_untyped()
EnvLoader.load_to_tuple()
EnvLoader.load_to_typed_dict()
EnvLoader.load_to_uuid()
- dataclass_wizard.environ.lookups module
- dataclass_wizard.environ.wizard module
- Module contents
- dataclass_wizard.utils package
- Submodules
- dataclass_wizard.utils.dataclass_compat module
- dataclass_wizard.utils.dict_helper module
- dataclass_wizard.utils.function_builder module
FunctionBuilder
FunctionBuilder.add_line()
FunctionBuilder.add_lines()
FunctionBuilder.break_()
FunctionBuilder.create_functions()
FunctionBuilder.current_function
FunctionBuilder.decrease_indent()
FunctionBuilder.elif_()
FunctionBuilder.else_()
FunctionBuilder.except_()
FunctionBuilder.except_multi()
FunctionBuilder.finalize_function()
FunctionBuilder.for_()
FunctionBuilder.function()
FunctionBuilder.functions
FunctionBuilder.globals
FunctionBuilder.if_()
FunctionBuilder.increase_indent()
FunctionBuilder.indent_level
FunctionBuilder.namespace
FunctionBuilder.prev_function
FunctionBuilder.try_()
is_builtin_class()
- dataclass_wizard.utils.json_util module
- dataclass_wizard.utils.lazy_loader module
- dataclass_wizard.utils.object_path module
- dataclass_wizard.utils.string_conv module
- dataclass_wizard.utils.type_conv module
- dataclass_wizard.utils.typing_compat module
- dataclass_wizard.utils.wrappers module
- Module contents
- dataclass_wizard.v1 package
- Submodules
- dataclass_wizard.v1.decorators module
- dataclass_wizard.v1.enums module
- dataclass_wizard.v1.loaders module
LoadMixin
LoadMixin.default_load_to()
LoadMixin.get_string_for_annotation()
LoadMixin.load_to_bool()
LoadMixin.load_to_bytearray()
LoadMixin.load_to_bytes()
LoadMixin.load_to_dataclass()
LoadMixin.load_to_date()
LoadMixin.load_to_datetime()
LoadMixin.load_to_decimal()
LoadMixin.load_to_defaultdict()
LoadMixin.load_to_dict()
LoadMixin.load_to_enum()
LoadMixin.load_to_float()
LoadMixin.load_to_int()
LoadMixin.load_to_iterable()
LoadMixin.load_to_literal()
LoadMixin.load_to_named_tuple()
LoadMixin.load_to_named_tuple_untyped()
LoadMixin.load_to_none()
LoadMixin.load_to_path()
LoadMixin.load_to_str()
LoadMixin.load_to_time()
LoadMixin.load_to_timedelta()
LoadMixin.load_to_tuple()
LoadMixin.load_to_typed_dict()
LoadMixin.load_to_union()
LoadMixin.load_to_uuid()
LoadMixin.transform_json_field
check_and_raise_missing_fields()
generate_field_code()
load_func_for_dataclass()
re_raise()
setup_default_loader()
- dataclass_wizard.v1.models module
Alias()
AliasPath()
Extras
Field
PatternBase
TypeInfo
TypeInfo.args
TypeInfo.ensure_in_locals()
TypeInfo.field_i
TypeInfo.i
TypeInfo.in_optional
TypeInfo.index
TypeInfo.multi_wrap()
TypeInfo.name
TypeInfo.origin
TypeInfo.prefix
TypeInfo.replace()
TypeInfo.type_name()
TypeInfo.v()
TypeInfo.v_and_next()
TypeInfo.v_and_next_k_v()
TypeInfo.wrap()
TypeInfo.wrap_builtin()
TypeInfo.wrap_dd()
- Module contents
- dataclass_wizard.wizard_cli package
Submodules¶
dataclass_wizard.abstractions module¶
Contains implementations for Abstract Base Classes
- class dataclass_wizard.abstractions.AbstractEnvWizard[source]¶
Bases:
ABC
Abstract class that defines the methods a sub-class must implement at a minimum to be considered a “true” Environment Wizard.
- class dataclass_wizard.abstractions.AbstractJSONWizard[source]¶
Bases:
ABC
- class dataclass_wizard.abstractions.AbstractLoader[source]¶
Bases:
ABC
- class dataclass_wizard.abstractions.AbstractLoaderGenerator[source]¶
Bases:
ABC
Abstract code generator which defines helper methods to generate the code for deserializing an object o of a given annotated type into the corresponding dataclass field during dynamic function construction.
- abstract static default_load_to(tp, extras)[source]¶
Generate code for the default load function if no other types match. Generally, this will be a stub load method.
- abstract classmethod get_string_for_annotation(tp, extras)[source]¶
Generate code to get the parser (dispatcher) for a given annotation type.
base_cls is the original class object, useful when the annotated type is a typing.ForwardRef object.
- abstract static load_to_bool(_, extras)[source]¶
Generate code to load a value into a boolean field. Adds a helper function as_bool to the local context.
- Return type:
str
- Parameters:
_ (str)
extras (Extras)
- abstract static load_to_bytearray(tp, extras)[source]¶
Generate code to load a value into a bytearray field.
- abstract static load_to_bytes(tp, extras)[source]¶
Generate code to load a value into a bytes field.
- static load_to_dataclass(tp, extras)[source]¶
Generate code to load a value into a dataclass type field.
- abstract static load_to_datetime(tp, extras)[source]¶
Generate code to load a value into a datetime field.
- abstract static load_to_decimal(tp, extras)[source]¶
Generate code to load a value into a Decimal field.
- abstract static load_to_defaultdict(tp, extras)[source]¶
Generate code to load a value into a defaultdict field.
- abstract static load_to_dict(tp, extras)[source]¶
Generate code to load a value into a dictionary field.
- abstract static load_to_float(tp, extras)[source]¶
Generate code to load a value into a float field.
- abstract static load_to_int(tp, extras)[source]¶
Generate code to load a value into an integer field.
- abstract static load_to_iterable(tp, extras)[source]¶
Generate code to load a value into an iterable field (list, set, etc.).
- abstract static load_to_literal(tp, extras)[source]¶
Generate code to confirm a value is equivalent to one of the provided literals.
- abstract static load_to_named_tuple(tp, extras)[source]¶
Generate code to load a value into a named tuple field.
- abstract classmethod load_to_named_tuple_untyped(tp, extras)[source]¶
Generate code to load a value into an untyped named tuple.
- abstract static load_to_path(tp, extras)[source]¶
Generate code to load a value into a Path field.
- abstract static load_to_timedelta(tp, extras)[source]¶
Generate code to load a value into a timedelta field.
- abstract static load_to_tuple(tp, extras)[source]¶
Generate code to load a value into a tuple field.
- abstract static load_to_typed_dict(tp, extras)[source]¶
Generate code to load a value into a typed dictionary field.
- abstract classmethod load_to_union(tp, extras)[source]¶
Generate code to load a value into a Union[X, Y, …] (one of [X, Y, …] possible types)
dataclass_wizard.bases module¶
- class dataclass_wizard.bases.ABCOrAndMeta(name, bases, namespace, /, **kwargs)[source]¶
Bases:
ABCMeta
Metaclass to add class-level __or__() and __and__() methods to a base class of type M.
- class dataclass_wizard.bases.AbstractEnvMeta[source]¶
Bases:
object
Base class definition for the EnvWizard.Meta inner class.
- all_fields = frozenset({'debug_enabled', 'env_file', 'env_prefix', 'field_to_env_var', 'key_lookup_with_load', 'key_transform_with_dump', 'recursive', 'secrets_dir', 'skip_defaults', 'skip_defaults_if', 'skip_if'})¶
- abstract classmethod bind_to(env_class, create=True, is_default=True)[source]¶
Initialize hook which applies the Meta config to env_class, which is typically a subclass of EnvWizard.
- Parameters:
env_class (Type) – A sub-class of EnvWizard.
create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.
is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.
- debug_enabled: ClassVar[bool] = False¶
- env_file: ClassVar[Union[bool, str, bytes, PathLike, int, Iterable[Union[str, bytes, PathLike, int]], None]] = None¶
- env_prefix: ClassVar[str] = None¶
- field_to_env_var: ClassVar[Dict[str, str]] = None¶
- fields_to_merge = frozenset({'env_file', 'env_prefix', 'field_to_env_var', 'key_lookup_with_load', 'key_transform_with_dump', 'recursive', 'secrets_dir', 'skip_defaults', 'skip_defaults_if', 'skip_if'})¶
- key_lookup_with_load: ClassVar[Union[LetterCasePriority, str]] = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- key_transform_with_dump: ClassVar[Union[LetterCase, str]] = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- recursive: ClassVar[bool] = True¶
- secrets_dir: ClassVar[EnvFileType | Sequence[EnvFileType]] = None¶
- skip_defaults: ClassVar[bool] = False¶
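To illustrate, here is a minimal, hedged sketch of how a few of these Meta fields might be set on an EnvWizard subclass (the class and variable names are made up for the example):
import os

from dataclass_wizard import EnvWizard

class AppConfig(EnvWizard):
    class _(EnvWizard.Meta):
        env_prefix = 'APP_'     # look up APP_HOST, APP_PORT, ...
        debug_enabled = False   # flip on to log the load process

    host: str
    port: int = 8080

os.environ['APP_HOST'] = 'localhost'
config = AppConfig()
assert config.host == 'localhost' and config.port == 8080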
- class dataclass_wizard.bases.AbstractMeta[source]¶
Bases:
object
Base class definition for the JSONWizard.Meta inner class.
- all_fields = frozenset({'auto_assign_tags', 'debug_enabled', 'json_key_to_field', 'key_transform_with_dump', 'key_transform_with_load', 'marshal_date_time_as', 'raise_on_unknown_json_key', 'recursive', 'recursive_classes', 'skip_defaults', 'skip_defaults_if', 'skip_if', 'tag', 'tag_key', 'v1', 'v1_debug', 'v1_field_to_alias', 'v1_key_case', 'v1_on_unknown_key', 'v1_unsafe_parse_dataclass_in_union'})¶
- auto_assign_tags: ClassVar[bool] = False¶
- abstract classmethod bind_to(dataclass, create=True, is_default=True)[source]¶
Initialize hook which applies the Meta config to dataclass, which is typically a subclass of JSONWizard.
- Parameters:
dataclass (Type) – A class which has been decorated by the @dataclass decorator; typically this is a sub-class of JSONWizard.
create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.
is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.
- debug_enabled: ClassVar[bool | int | str] = False¶
- fields_to_merge = frozenset({'auto_assign_tags', 'debug_enabled', 'key_transform_with_dump', 'key_transform_with_load', 'marshal_date_time_as', 'raise_on_unknown_json_key', 'recursive_classes', 'skip_defaults', 'skip_defaults_if', 'skip_if', 'tag_key', 'v1', 'v1_debug', 'v1_key_case', 'v1_on_unknown_key', 'v1_unsafe_parse_dataclass_in_union'})¶
- json_key_to_field: ClassVar[Dict[str, str]] = None¶
- key_transform_with_dump: ClassVar[Union[LetterCase, str]] = None¶
- key_transform_with_load: ClassVar[Union[LetterCase, str]] = None¶
- marshal_date_time_as: ClassVar[Union[DateTimeTo, str]] = None¶
- raise_on_unknown_json_key: ClassVar[bool] = False¶
- recursive: ClassVar[bool] = True¶
- recursive_classes: ClassVar[bool] = False¶
- skip_defaults: ClassVar[bool] = False¶
- tag: ClassVar[str] = None¶
- tag_key: ClassVar[str] = '__tag__'¶
- v1: ClassVar[bool] = False¶
- v1_debug: ClassVar[bool | int | str] = False¶
- v1_field_to_alias: ClassVar[Dict[str, str]] = None¶
- v1_unsafe_parse_dataclass_in_union: ClassVar[bool] = False¶
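To illustrate, a small sketch of a JSONWizard subclass that sets a few of these Meta fields (the class, field, and key names are illustrative only):
from dataclasses import dataclass

from dataclass_wizard import JSONWizard

@dataclass
class Item(JSONWizard):
    class _(JSONWizard.Meta):
        key_transform_with_dump = 'SNAKE'   # dump keys as snake_case
        raise_on_unknown_json_key = False   # silently ignore unknown keys

    item_name: str
    unit_price: float

item = Item.from_dict({'itemName': 'pen', 'unitPrice': 1.5})
print(item.to_dict())   # keys dumped using the configured transform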
- class dataclass_wizard.bases.BaseDumpHook[source]¶
Bases:
object
Container class for type hooks.
dataclass_wizard.bases_meta module¶
Ideally this would live in the bases module; however, moving it there would create a circular import, since the loaders and dumpers modules both import directly from bases.
- class dataclass_wizard.bases_meta.BaseEnvWizardMeta[source]¶
Bases:
AbstractEnvMeta
Superclass definition for the EnvWizard.Meta inner class.
See the implementation of the AbstractEnvMeta class for the available config that can be set, as well as for descriptions on any implemented methods.
- classmethod bind_to(env_class, create=True, is_default=True)[source]¶
Initialize hook which applies the Meta config to env_class, which is typically a subclass of EnvWizard.
- Parameters:
env_class (type) – A sub-class of EnvWizard.
create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.
is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.
- class dataclass_wizard.bases_meta.BaseJSONWizardMeta[source]¶
Bases:
AbstractMeta
Superclass definition for the JSONWizard.Meta inner class.
See the implementation of the AbstractMeta class for the available config that can be set, as well as for descriptions on any implemented methods.
- classmethod bind_to(dataclass, create=True, is_default=True, base_loader=None)[source]¶
Initialize hook which applies the Meta config to dataclass, which is typically a subclass of JSONWizard.
- Parameters:
dataclass (type) – A class which has been decorated by the @dataclass decorator; typically this is a sub-class of JSONWizard.
create – When true, a separate loader/dumper will be created for the class. If disabled, this will access the root loader/dumper, so modifying this should affect global settings across all dataclasses that use the JSON load/dump process.
is_default – When enabled, the Meta will be cached as the default Meta config for the dataclass. Defaults to true.
- dataclass_wizard.bases_meta.DumpMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the JSON dump (serialization) process, which is intended for use alongside the asdict helper function.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass, {"myStr": "value"})
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
- dataclass_wizard.bases_meta.EnvMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the EnvWizard.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractEnvMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> EnvMeta(key_transform_with_dump='SNAKE').bind_to(MyClass)
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
- dataclass_wizard.bases_meta.LoadMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the JSON load (de-serialization) process, which is intended for use alongside the fromdict helper function.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
dataclass_wizard.class_helper module¶
- dataclass_wizard.class_helper.call_meta_initializer_if_needed(cls, package_name='dataclass_wizard')[source]¶
Calls the Meta initializer when the inner Meta is sub-classed.
- dataclass_wizard.class_helper.create_meta(cls, cls_name=None, **kwargs)[source]¶
Sets the Meta config for the AbstractJSONWizard subclass.
- WARNING: Only use if the Meta config is undefined, e.g. get_meta for the cls returns base_cls.
- dataclass_wizard.class_helper.create_new_class(class_or_instance, bases, suffix=None, attr_dict=None)[source]¶
- dataclass_wizard.class_helper.dataclass_field_to_load_parser(cls_loader, cls, config, save=True)[source]¶
- dataclass_wizard.class_helper.field_to_env_var(cls)[source]¶
Returns a mapping of field in the EnvWizard subclass to env variable.
dataclass_wizard.constants module¶
dataclass_wizard.decorators module¶
- class dataclass_wizard.decorators.cached_class_property(func)[source]¶
Bases:
object
Descriptor decorator implementing a class-level, read-only property, which caches the attribute on-demand on the first use.
- class dataclass_wizard.decorators.cached_property(func)[source]¶
Bases:
object
Descriptor decorator implementing an instance-level, read-only property, which caches the attribute on-demand on the first use.
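A brief usage sketch of these two descriptors (the example class is made up for illustration):
from dataclass_wizard.decorators import cached_class_property, cached_property

class Registry:
    @cached_class_property
    def default_name(cls):
        # computed once, then cached at the class level
        return cls.__name__.lower()

    @cached_property
    def expensive_value(self):
        # computed once per instance, then cached on the instance
        return sum(range(1_000_000))

r = Registry()
print(Registry.default_name)   # 'registry' (cached after first access)
print(r.expensive_value)       # computed on first access
print(r.expensive_value)       # served from the cache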
- dataclass_wizard.decorators.resolve_alias_func(f, _locals=None, raise_=False)[source]¶
Resolve the underlying single-arg alias function for f, using the provided function locals (which will be a dict). If f does not have an associated alias function, we return f itself.
- Raises:
AttributeError – If raise_ is true and f is not a single-arg alias function.
- Return type:
Callable
- Parameters:
f (Callable)
_locals (Dict)
- dataclass_wizard.decorators.try_with_load(load_fn)[source]¶
Try to call a load hook, catch and re-raise errors as a ParseError.
Note: this function will be recursively called on all load hooks for a dataclass, when debug_mode is enabled for the dataclass.
- Parameters:
load_fn (Callable) – The load hook; can be a regular callable, a single-arg alias, or an identity function.
- Returns:
The decorated load hook.
- dataclass_wizard.decorators.try_with_load_with_single_arg(original_fn, single_arg_load_fn, base_type)[source]¶
Similar to try_with_load(), but for single-arg alias functions.
- Parameters:
original_fn (Callable) – The original load hook (function)
single_arg_load_fn (Callable) – The single-argument load hook
base_type (Type) – The annotated (or desired) type
dataclass_wizard.dumpers module¶
The implementation below uses code adapted from the asdict helper function from the library Dataclasses (https://github.com/ericvsmith/dataclasses).
This library is available under the Apache 2.0 license, which can be obtained from http://www.apache.org/licenses/LICENSE-2.0.
See the end of this file for the original Apache license from this library.
- class dataclass_wizard.dumpers.DumpMixin[source]¶
Bases:
AbstractDumper
,BaseDumpHook
This Mixin class derives its name from the eponymous json.dumps function. Essentially it contains helper methods to convert Python built-in types to a more ‘JSON-friendly’ version.
- static transform_dataclass_field()¶
Convert a string to Camel Case.
Examples:
>>> to_camel_case("device_type")
'deviceType'
- Return type:
str
- Parameters:
string (str)
- dataclass_wizard.dumpers.asdict(o, *, cls=None, dict_factory=<class 'dict'>, exclude=None, **kwargs)[source]¶
Return the fields of a dataclass instance as a new dictionary mapping field names to field values.
Example usage:
@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here's a sample usage of this below:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.
- Parameters:
o (T)
exclude (Collection[str] | None)
- Return type:
dict[str, Any]
- dataclass_wizard.dumpers.dump_func_for_dataclass(cls, config=None, nested_cls_to_dump_func=None)[source]¶
- Return type:
Callable[[TypeVar(T), Any, Any, Any], dict[str, Any]]
- Parameters:
cls (Type[T])
config (Type[META_] | None)
nested_cls_to_dump_func (Dict[Type, Any])
- dataclass_wizard.dumpers.get_dumper(cls=None, create=True)[source]¶
Get the dumper for the class, using the following logic:
Return the class if it's already a sub-class of DumpMixin.
If create is enabled (which is the default), a new sub-class of DumpMixin for the class will be generated and cached on the initial run.
Otherwise, we will return the base dumper, DumpMixin, which can potentially be shared by more than one dataclass.
- Return type:
Type[DumpMixin]
dataclass_wizard.enums module¶
Re-usable Enum definitions
- class dataclass_wizard.enums.DateTimeTo(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]¶
Bases:
Enum
- ISO_FORMAT = 0¶
- TIMESTAMP = 1¶
- class dataclass_wizard.enums.LetterCase(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]¶
Bases:
Enum
- CAMEL = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- LISP = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- NONE = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- PASCAL = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- SNAKE = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- class dataclass_wizard.enums.LetterCasePriority(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]¶
Bases:
Enum
Helper Enum which determines which letter casing we want to prioritize when loading environment variable names.
The default
- CAMEL = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- PASCAL = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- SCREAMING_SNAKE = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
- SNAKE = <dataclass_wizard.utils.wrappers.FuncWrapper object>¶
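For instance, these enum members (or their string names) can be passed wherever a key transform is accepted; a minimal sketch, with made-up field names:
from dataclasses import dataclass

from dataclass_wizard import DumpMeta, asdict
from dataclass_wizard.enums import LetterCase

@dataclass
class Point:
    x_coord: int
    y_coord: int

DumpMeta(key_transform=LetterCase.CAMEL).bind_to(Point)
print(asdict(Point(1, 2)))   # expected: {'xCoord': 1, 'yCoord': 2}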
dataclass_wizard.errors module¶
- exception dataclass_wizard.errors.ExtraData(cls, extra_kwargs, field_names)[source]¶
Bases:
JSONWizardError
Error raised when extra keyword arguments are passed in to the constructor or __init__() method of an EnvWizard subclass.
Note that this error class is raised by default, unless a value for the extra field is specified in the
Meta
class.- Parameters:
cls (Type)
extra_kwargs (Collection[str])
field_names (Collection[str])
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.InvalidConditionError(cls, field_name)[source]¶
Bases:
JSONWizardError
Error raised when a condition is not wrapped in
SkipIf
.- Parameters:
cls (Type)
field_name (str)
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.JSONWizardError[source]¶
Bases:
ABC
,Exception
Base error class, for errors raised by this library.
- property class_name: str | None¶
- abstract property message: str¶
Format and return an error message.
- property parent_cls: type | None¶
- exception dataclass_wizard.errors.MissingData(nested_cls, **kwargs)[source]¶
Bases:
ParseError
Error raised when unable to create a class instance, as the JSON object is None.
- Parameters:
nested_cls (Type)
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.MissingFields(base_err, obj, cls, cls_fields, cls_kwargs=None, missing_fields=None, missing_keys=None, **kwargs)[source]¶
Bases:
JSONWizardError
Error raised when unable to create a class instance (most likely due to missing arguments)
- Parameters:
base_err (Exception | None)
obj (Dict[str, Any])
cls (Type)
cls_fields (Tuple[Field, ...])
cls_kwargs (JSONObject | None)
missing_fields (Collection[str] | None)
missing_keys (Collection[str] | None)
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.MissingVars(cls, missing_vars)[source]¶
Bases:
JSONWizardError
Error raised when unable to create an instance of an EnvWizard subclass (most likely due to missing environment variables in the Environment)
- Parameters:
cls (Type)
missing_vars (Sequence[Tuple[str, str | None, str, Any]])
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.ParseError(base_err, obj, ann_type, _default_class=None, _field_name=None, _json_object=None, **kwargs)[source]¶
Bases:
JSONWizardError
Base error when an error occurs during the JSON load process.
- Parameters:
base_err (Exception)
obj (Any)
ann_type (Type | Iterable | None)
_default_class (type | None)
_field_name (str | None)
_json_object (Any)
- property field_name: str | None¶
- property json_object¶
- property message: str¶
Format and return an error message.
- exception dataclass_wizard.errors.RecursiveClassError(cls)[source]¶
Bases:
JSONWizardError
Error raised when we encounter a RecursionError due to cyclic or self-referential dataclasses.
- Parameters:
cls (Type)
- property message: str¶
Format and return an error message.
- dataclass_wizard.errors.UnknownJSONKey¶
alias of
UnknownKeysError
- exception dataclass_wizard.errors.UnknownKeysError(unknown_keys, obj, cls, cls_fields, **kwargs)[source]¶
Bases:
JSONWizardError
Error raised when unknown JSON key(s) are encountered in the JSON load process.
Note that this error class is only raised when the raise_on_unknown_json_key flag is enabled in the
Meta
class.- Parameters:
unknown_keys (list[str] | str)
obj (Dict[str, Any])
cls (Type)
cls_fields (Tuple[Field, ...])
- property json_key¶
- property message: str¶
Format and return an error message.
- dataclass_wizard.errors.show_deprecation_warning(fn, reason, fmt='Deprecated function {name} ({reason}).')[source]¶
Display a deprecation warning for a given function.
@param fn: Function which is deprecated.
@param reason: Reason for the deprecation.
@param fmt: Format string for the name/reason.
- Return type:
None
- Parameters:
fn (Callable | str)
reason (str)
fmt (str)
dataclass_wizard.lazy_imports module¶
Lazy Import definitions. Generally, these imports will be available when any “bonus features” are installed, i.e. as below:
$ pip install dataclass-wizard[timedelta]
dataclass_wizard.loader_selection module¶
- dataclass_wizard.loader_selection.fromdict(cls, d)[source]¶
Converts a Python dictionary object to a dataclass instance.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here's a sample usage of this below:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
TypeVar(T)
- Parameters:
cls (type[T])
d (dict[str, Any])
- dataclass_wizard.loader_selection.fromlist(cls, list_of_dict)[source]¶
Converts a Python list object to a list of dataclass instances.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
- Return type:
list[TypeVar(T)]
- Parameters:
cls (type[T])
list_of_dict (list[dict[str, Any]])
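A short sketch of fromlist in use (the dataclass and data are illustrative):
from dataclasses import dataclass

from dataclass_wizard import fromlist

@dataclass
class User:
    name: str
    age: int

users = fromlist(User, [{'name': 'ann', 'age': 30}, {'name': 'bob', 'age': 41}])
assert users[1].age == 41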
- dataclass_wizard.loader_selection.get_loader(class_or_instance=None, create=True, base_cls=None, v1=None)[source]¶
Get the loader for the class, using the following logic:
Return the class if it's already a sub-class of LoadMixin.
If create is enabled (which is the default), a new sub-class of LoadMixin for the class will be generated and cached on the initial run.
Otherwise, we will return the base loader, LoadMixin, which can potentially be shared by more than one dataclass.
- Parameters:
base_cls (T)
v1 (bool | None)
- Return type:
type[T]
dataclass_wizard.loaders module¶
- class dataclass_wizard.loaders.LoadMixin[source]¶
Bases:
AbstractLoader
,BaseLoadHook
This Mixin class derives its name from the eponymous json.loads function. Essentially it contains helper methods to convert JSON strings (or a Python dictionary object) to a dataclass which can often contain complex types such as lists, dicts, or even other dataclasses nested within it.
Refer to the AbstractLoader class for documentation on any of the implemented methods.
- classmethod get_parser_for_annotation(ann_type, base_cls=None, extras=None)[source]¶
Returns the Parser (dispatcher) for a given annotation type.
- Return type:
Union[AbstractParser, Callable[[dict[str, Any]], TypeVar(T)]]
- Parameters:
ann_type (Type[T])
base_cls (Type)
extras (Extras)
- static load_after_type_check(o, base_type)[source]¶
- Return type:
TypeVar(T)
- Parameters:
o (Any)
base_type (Type[T])
- static load_func_for_dataclass(cls, config)[source]¶
- Return type:
Callable[[dict[str, Any]], TypeVar(T)]
- Parameters:
cls (Type[T])
config (Type[META_] | None)
- static load_to_bool(o, _)[source]¶
- Return type:
bool
- Parameters:
o (str | bool | int | float)
_ (Type[bool])
- static load_to_date(base_type=<class 'datetime.date'>, default=None, raise_=True)¶
Attempt to convert an object o to a date object using the below logic.
str: convert date strings (in ISO format) via the built-in fromisoformat method.
Number (int or float): convert a numeric timestamp via the built-in fromtimestamp method.
date: return object o if it's already of this type or sub-type.
Otherwise, if we're unable to convert the value of o to a date as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | Number | date)
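In practice, this means a date field can be populated from either an ISO-format string or a numeric timestamp; a small, hedged sketch with made-up names:
from dataclasses import dataclass
from datetime import date

from dataclass_wizard import fromdict

@dataclass
class Event:
    start: date

assert fromdict(Event, {'start': '2021-01-02'}).start == date(2021, 1, 2)
# A numeric timestamp also works (converted via date.fromtimestamp).
print(fromdict(Event, {'start': 1577836800}).start)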
- static load_to_datetime(base_type=<class 'datetime.datetime'>, default=None, raise_=True)¶
Attempt to convert an object o to a datetime object using the below logic.
str: convert datetime strings (in ISO format) via the built-in fromisoformat method.
Number (int or float): convert a numeric timestamp via the built-in fromtimestamp method, and return a UTC datetime.
datetime: return object o if it's already of this type or sub-type.
Otherwise, if we're unable to convert the value of o to a datetime as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | Number | datetime)
- static load_to_decimal(o, base_type)[source]¶
- Return type:
Decimal
- Parameters:
o (int | float)
base_type (Type[Decimal])
- static load_to_defaultdict(o, base_type, default_factory, key_parser, val_parser)[source]¶
- Return type:
TypeVar(DD, bound=defaultdict)
- Parameters:
o (Dict)
base_type (Type[DD])
default_factory (Callable[[], T])
key_parser (AbstractParser)
val_parser (AbstractParser)
- static load_to_dict(o, base_type, key_parser, val_parser)[source]¶
- Return type:
TypeVar(M, bound=Mapping)
- Parameters:
o (Dict)
base_type (Type[M])
key_parser (AbstractParser)
val_parser (AbstractParser)
- static load_to_enum(o, base_type)[source]¶
- Return type:
TypeVar(E, bound=Enum)
- Parameters:
o (AnyStr | int | float)
base_type (Type[E])
- static load_to_float(o, base_type)[source]¶
- Return type:
Union[int, float]
- Parameters:
o (SupportsFloat | str)
base_type (Type[int | float])
- static load_to_int(base_type=<class 'int'>, default=0, raise_=True)¶
Return o if it is already an int; otherwise, return the int value for a string. If o is None or an empty string, return default instead.
If o cannot be converted to an int, raise an error if raise_ is true; otherwise, return default instead.
- Raises:
TypeError – If o is a bool (which is an int sub-class)
ValueError – When o cannot be converted to an int, and the raise_ parameter is true
- Parameters:
o (str | int | float | bool | None)
- static load_to_iterable(o, base_type, elem_parser)[source]¶
- Return type:
TypeVar(LSQ, list, set, frozenset, deque)
- Parameters:
o (Iterable)
base_type (Type[LSQ])
elem_parser (AbstractParser)
- static load_to_named_tuple(o, base_type, field_to_parser, field_parsers)[source]¶
- Return type:
TypeVar(NT, bound=NamedTuple)
- Parameters:
o (Dict | List | Tuple)
base_type (Type[NT])
field_to_parser (FieldToParser)
field_parsers (List[AbstractParser])
- static load_to_named_tuple_untyped(o, base_type, dict_parser, list_parser)[source]¶
- Return type:
TypeVar(NT, bound=NamedTuple)
- Parameters:
o (Dict | List | Tuple)
base_type (Type[NT])
dict_parser (AbstractParser)
list_parser (AbstractParser)
- static load_to_path(o, base_type)[source]¶
- Return type:
Path
- Parameters:
o (int | float)
base_type (Type[Path])
- static load_to_str(base_type=<class 'str'>)¶
Return o if already a str, otherwise return the string value for o. If o is None, return an empty string instead.
- Parameters:
o (str | None)
- static load_to_time(base_type=<class 'datetime.time'>, default=None, raise_=True)¶
Attempt to convert an object o to a time object using the below logic.
str: convert time strings (in ISO format) via the built-in fromisoformat method.
time: return object o if it's already of this type or sub-type.
Otherwise, if we're unable to convert the value of o to a time as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | time)
- static load_to_timedelta(base_type=<class 'datetime.timedelta'>, default=None, raise_=True)¶
Attempt to convert an object o to a timedelta object using the below logic.
str: if the string is in a numeric form like "1.23", we convert it to a float and assume it's in seconds. Otherwise, we convert strings via the pytimeparse.parse function.
int or float: a numeric value is assumed to be in seconds. In this case, it is passed in to the constructor like timedelta(seconds=...).
timedelta: return object o if it's already of this type or sub-type.
Otherwise, if we're unable to convert the value of o to a timedelta as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | int | float | timedelta)
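A hedged sketch of what this looks like in practice; field names are made up, and non-numeric strings require the optional pytimeparse dependency:
from dataclasses import dataclass
from datetime import timedelta

from dataclass_wizard import fromdict

@dataclass
class Job:
    timeout: timedelta

# Numeric values (or numeric strings) are treated as seconds.
assert fromdict(Job, {'timeout': 90}).timeout == timedelta(seconds=90)
assert fromdict(Job, {'timeout': '1.5'}).timeout == timedelta(seconds=1.5)
# Strings such as '1h 30m' would go through pytimeparse.parse, installed
# via the optional extra: pip install dataclass-wizard[timedelta]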
- static load_to_tuple(o, base_type, elem_parsers)[source]¶
- Return type:
Tuple
- Parameters:
o (List | Tuple)
base_type (Type[Tuple])
elem_parsers (Sequence[AbstractParser])
- static load_to_typed_dict(o, base_type, key_to_parser, required_keys, optional_keys)[source]¶
- Return type:
TypeVar(M, bound=Mapping)
- Parameters:
o (Dict)
base_type (Type[M])
key_to_parser (FieldToParser)
required_keys (frozenset[str])
optional_keys (frozenset[str])
- static load_to_uuid(o, base_type)[source]¶
- Return type:
TypeVar(U, bound=UUID)
- Parameters:
o (AnyStr | U)
base_type (Type[U])
- static transform_json_field()¶
Make an underscored, lowercase form from the expression in the string.
Example:
>>> to_snake_case("DeviceType")
'device_type'
- Return type:
str
- Parameters:
string (str)
dataclass_wizard.log module¶
dataclass_wizard.models module¶
- class dataclass_wizard.models.Extras[source]¶
Bases:
TypedDict
“Extra” config that can be used in the load / dump process.
- cls: type¶
- cls_name: str¶
- config: NotRequired[META]¶
- fn_gen: FunctionBuilder¶
- locals: dict[str, Any]¶
- pattern: NotRequired[PatternedDT]¶
- class dataclass_wizard.models.JSON(*keys, all=False, dump=True, path=False)[source]¶
Bases:
object
- all¶
- dump¶
- keys¶
- path¶
- class dataclass_wizard.models.JSONField(keys, all, dump, default, default_factory, init, repr, hash, compare, metadata, path=False)[source]¶
Bases:
Field
- Parameters:
all (bool)
dump (bool)
path (bool)
- json¶
- class dataclass_wizard.models.PatternedDT(pattern, cls=None)[source]¶
Bases:
Generic
[DT
]- cls¶
- pattern¶
- dataclass_wizard.models.SkipIf(condition)[source]¶
Mark a condition to be used as a skip directive during serialization.
- dataclass_wizard.models.env_field(keys, *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)¶
- dataclass_wizard.models.finalize_skip_if(skip_if, operand_1, conditional)[source]¶
Finalizes the skip condition by generating the appropriate string based on the condition.
- Args:
skip_if (Condition): The condition to evaluate, containing truthiness and operation info.
operand_1 (str): The primary operand for the condition (e.g., a variable or value).
conditional (str): The conditional operator to use (e.g., '==', '!=').
- Returns:
str: The resulting skip condition as a string.
- Example:
>>> cond = Condition(t_or_f=True, op='+', val=None)
>>> finalize_skip_if(cond, 'my_var', '==')
'my_var'
- dataclass_wizard.models.get_skip_if_condition(skip_if, _locals, operand_2)[source]¶
Retrieves the skip condition based on the provided Condition object.
- Args:
skip_if (Condition): The condition to evaluate.
_locals (dict[str, Any]): A dictionary of local variables for condition evaluation.
operand_2 (str): The secondary operand (e.g., a variable or value).
- Returns:
Any: The result of the evaluated condition or a string representation for custom values.
- Example:
>>> cond = Condition(t_or_f=False, op='==', val=10)
>>> locals_dict = {}
>>> get_skip_if_condition(cond, locals_dict, 'other_var')
'== other_var'
- dataclass_wizard.models.json_field(keys, *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)[source]¶
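A minimal sketch of json_field in use, mapping one or more JSON keys to a single dataclass field (the field and key names are made up for the example):
from dataclasses import dataclass

from dataclass_wizard import JSONWizard, json_field

@dataclass
class Sensor(JSONWizard):
    # Either JSON key is accepted when loading.
    temperature: float = json_field(('temp', 'temperature'), default=0.0)

s = Sensor.from_dict({'temp': 21.5})
assert s.temperature == 21.5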
dataclass_wizard.parsers module¶
- class dataclass_wizard.parsers.DefaultDictParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
MappingParser
[DD
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[DD])
hook (Callable[[Any, Type[DD], Callable[[], T], AbstractParser, AbstractParser], DD])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
- default_factory¶
- class dataclass_wizard.parsers.IdentityParser(cls, extras, base_type)[source]¶
Bases:
AbstractParser
[Type
[T
],T
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
-
cls:
InitVar
¶
-
extras:
InitVar
¶
- class dataclass_wizard.parsers.IterableParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[Type
[LSQ
],LSQ
Parser for a list, set, frozenset, deque, or a subclass of any of these types.
- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[LSQ])
hook (Callable[[Iterable, Type[LSQ], AbstractParser], LSQ])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
- elem_parser¶
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Iterable
,Type
[TypeVar
(LSQ
,list
,set
,frozenset
,deque
)],AbstractParser
],TypeVar
(LSQ
,list
,set
,frozenset
,deque
)]¶
- class dataclass_wizard.parsers.LiteralParser(cls, extras, base_type)[source]¶
Bases:
AbstractParser
[M
,M
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[M])
- value_to_type¶
- class dataclass_wizard.parsers.MappingParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[Type
[M
],M
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[M])
hook (Callable[[Any, Type[M], AbstractParser, AbstractParser], M])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Any
,Type
[TypeVar
(M
, bound=Mapping
)],AbstractParser
,AbstractParser
],TypeVar
(M
, bound=Mapping
)]¶
- key_parser¶
- val_parser¶
- val_type¶
- class dataclass_wizard.parsers.NamedTupleParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[tuple
,NT
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
hook (Callable[[Any, type[tuple], FieldToParser | None, List[AbstractParser]], NT])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
- field_parsers¶
- field_to_parser¶
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Any
,type
[tuple
],Optional
[FieldToParser],List
[AbstractParser
]],TypeVar
(NT
, bound=NamedTuple
)]¶
- class dataclass_wizard.parsers.NamedTupleUntypedParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[tuple
,NT
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
hook (Callable[[Any, Type[tuple], AbstractParser, AbstractParser], NT])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
- dict_parser¶
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Any
,Type
[tuple
],AbstractParser
,AbstractParser
],TypeVar
(NT
, bound=NamedTuple
)]¶
- list_parser¶
- class dataclass_wizard.parsers.OptionalParser(cls, extras, base_type, get_parser)[source]¶
Bases:
AbstractParser
[T
,T
|None
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
-
get_parser:
InitVar
¶
- parser¶
- class dataclass_wizard.parsers.Parser(cls, extras, base_type, hook)[source]¶
Bases:
AbstractParser
[T
,T
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
hook (Callable[[Any, type[T]], T])
-
hook:
Callable
[[Any
,type
[TypeVar
(T
)]],TypeVar
(T
)]¶
- class dataclass_wizard.parsers.PatternedDTParser(cls, extras, base_type)[source]¶
Bases:
AbstractParser
[PatternedDT
,DT
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (PatternedDT)
- hook¶
- class dataclass_wizard.parsers.RecursionSafeParser(cls, extras, base_type, hook)[source]¶
Bases:
AbstractParser
Parser to handle cyclic or self-referential dataclasses.
For example:
@dataclass
class A:
    a: A | None = None

instance = fromdict(A, {'a': {'a': {'a': None}}})
- Parameters:
cls (dataclasses.InitVar[Type])
extras (Extras)
base_type (type[T])
hook (Callable[[Any], T] | None)
-
hook:
Optional
[Callable
[[Any
],TypeVar
(T
)]]¶
- class dataclass_wizard.parsers.SingleArgParser(cls, extras, base_type, hook)[source]¶
Bases:
AbstractParser
[Type
[T
],T
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (type[T])
hook (Callable[[Any], T])
-
hook:
Callable
[[Any
],TypeVar
(T
)]¶
- class dataclass_wizard.parsers.TupleParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[Type
[S
],S
Parser for subscripted and un-subscripted Tuple's.
See VariadicTupleParser for the parser that handles the variadic form, i.e. Tuple[str, ...]
- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[S])
hook (Callable[[Any, Type[S], Tuple[AbstractParser, ...] | None], S])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
- elem_parsers¶
- elem_types¶
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Any
,Type
[TypeVar
(S
, bound=Sequence
)],Optional
[Tuple
[AbstractParser
,...
]]],TypeVar
(S
, bound=Sequence
)]¶
- required_count¶
- total_count¶
- class dataclass_wizard.parsers.TypedDictParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
AbstractParser
[Type
[M
],M
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[M])
hook (Callable[[Any, Type[M], FieldToParser, frozenset[str], frozenset[str]], M])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
-
get_parser:
InitVar
¶
-
hook:
Callable
[[Any
,Type
[TypeVar
(M
, bound=Mapping
)], FieldToParser,frozenset
[str
],frozenset
[str
]],TypeVar
(M
, bound=Mapping
)]¶
- key_to_parser¶
- optional_keys¶
- required_keys¶
- class dataclass_wizard.parsers.UnionParser(cls, extras, base_type, get_parser)[source]¶
Bases:
AbstractParser
[Tuple
[Type
[T
], …],T
|None
]- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Tuple[Type[T], ...])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
-
get_parser:
InitVar
¶
- parsers¶
- tag_key¶
- tag_to_parser¶
- class dataclass_wizard.parsers.VariadicTupleParser(cls, extras, base_type, hook, get_parser)[source]¶
Bases:
TupleParser
Parser that handles the variadic form of Tuple's, i.e. Tuple[str, ...]
Per PEP 484, only one required type is allowed before the Ellipsis. That is, Tuple[int, ...] is valid whereas Tuple[int, str, ...] would be invalid. See here for more info.
- Parameters:
cls (dataclasses.InitVar[Type])
extras (dataclasses.InitVar[Extras])
base_type (Type[S])
hook (Callable[[Any, Type[S], Tuple[AbstractParser, ...] | None], S])
get_parser (dataclasses.InitVar[Callable[[Type[~T], Type, dataclass_wizard.models.Extras], dataclass_wizard.abstractions.AbstractParser]])
-
cls:
InitVar
¶
-
extras:
InitVar
¶
- first_elem_parser¶
-
get_parser:
InitVar
¶
dataclass_wizard.property_wizard module¶
- dataclass_wizard.property_wizard.property_wizard(*args, **kwargs)[source]¶
Adds support for field properties with default values in dataclasses.
For examples of usage, please see the Using Field Properties section in the docs. I also added an answer on a SO article that deals with using such properties in dataclasses.
dataclass_wizard.serial_json module¶
- class dataclass_wizard.serial_json.JSONPyWizard[source]¶
Bases:
JSONSerializable
Helper for JSONWizard that ensures dumping to JSON keeps keys as-is.
- class dataclass_wizard.serial_json.JSONSerializable[source]¶
Bases:
AbstractJSONWizard
- class Meta[source]¶
Bases:
BaseJSONWizardMeta
- classmethod from_dict()¶
Converts a Python dictionary object to a dataclass instance.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here's a sample usage of this below:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
TypeVar(T)
- Parameters:
d (dict[str, Any])
- classmethod from_list()¶
Converts a Python list object to a list of dataclass instances.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
- Return type:
list[TypeVar(T)]
- Parameters:
list_of_dict (list[dict[str, Any]])
- to_dict(*, cls=None, dict_factory=<class 'dict'>, exclude=None, **kwargs)¶
Return the fields of a dataclass instance as a new dictionary mapping field names to field values.
Example usage:
@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here's a sample usage of this below:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.
- Parameters:
o (T)
exclude (Collection[str] | None)
- Return type:
dict[str, Any]
- dataclass_wizard.serial_json.JSONWizard¶
alias of
JSONSerializable
dataclass_wizard.type_def module¶
- class dataclass_wizard.type_def.Decoder(*args, **kwargs)[source]¶
Bases:
Protocol
Represents a decoder for JSON -> Python object, e.g. analogous to json.loads
- class dataclass_wizard.type_def.Encoder(*args, **kwargs)[source]¶
Bases:
Protocol
Represents an encoder for Python object -> JSON, e.g. analogous to json.dumps
- class dataclass_wizard.type_def.FileDecoder(*args, **kwargs)[source]¶
Bases:
Protocol
Represents a decoder for JSON file -> Python object, e.g. analogous to json.load
- class dataclass_wizard.type_def.FileEncoder(*args, **kwargs)[source]¶
Bases:
Protocol
Represents an encoder for Python object -> JSON file, e.g. analogous to json.dump
- class dataclass_wizard.type_def.NoneType¶
Bases:
object
The type of the None singleton.
- dataclass_wizard.type_def.PyForwardRef¶
alias of
ForwardRef
- dataclass_wizard.type_def.PyProtocol¶
alias of
Protocol
- dataclass_wizard.type_def.PyTypedDict(typename, fields=<sentinel>, /, *, total=True)¶
A simple typed namespace. At runtime it is equivalent to a plain dict.
TypedDict creates a dictionary type such that a type checker will expect all instances to have a certain set of keys, where each key is associated with a value of a consistent type. This expectation is not checked at runtime.
Usage:
>>> class Point2D(TypedDict):
...     x: int
...     y: int
...     label: str
...
>>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
>>> b: Point2D = {'z': 3, 'label': 'bad'}  # Fails type check
>>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
True
The type info can be accessed via the Point2D.__annotations__ dict, and the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. TypedDict supports an additional equivalent form:
Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})
By default, all keys must be present in a TypedDict. It is possible to override this by specifying totality:
class Point2D(TypedDict, total=False):
    x: int
    y: int
This means that a Point2D TypedDict can have any of the keys omitted. A type checker is only expected to support a literal False or True as the value of the total argument. True is the default, and makes all items defined in the class body be required.
The Required and NotRequired special forms can also be used to mark individual keys as being required or not required:
class Point2D(TypedDict):
    x: int  # the "x" key must always be present (Required is the default)
    y: NotRequired[int]  # the "y" key can be omitted
See PEP 655 for more details on Required and NotRequired.
The ReadOnly special form can be used to mark individual keys as immutable for type checkers:
class DatabaseUser(TypedDict):
    id: ReadOnly[int]  # the "id" key must not be modified
    username: str  # the "username" key can be changed
- dataclass_wizard.type_def.dataclass_transform(*, eq_default=True, order_default=False, kw_only_default=False, frozen_default=False, field_specifiers=(), **kwargs)[source]¶
Decorator to mark an object as providing dataclass-like behaviour.
The decorator can be applied to a function, class, or metaclass.
Example usage with a decorator function:
@dataclass_transform()
def create_model[T](cls: type[T]) -> type[T]:
    ...
    return cls

@create_model
class CustomerModel:
    id: int
    name: str
On a base class:
@dataclass_transform()
class ModelBase: ...

class CustomerModel(ModelBase):
    id: int
    name: str
On a metaclass:
@dataclass_transform()
class ModelMeta(type): ...

class ModelBase(metaclass=ModelMeta): ...

class CustomerModel(ModelBase):
    id: int
    name: str
The CustomerModel classes defined above will be treated by type checkers similarly to classes created with @dataclasses.dataclass. For example, type checkers will assume these classes have __init__ methods that accept id and name.
- Return type:
_IdentityCallable
- Parameters:
eq_default (bool)
order_default (bool)
kw_only_default (bool)
frozen_default (bool)
field_specifiers (tuple[type[Any] | Callable[[...], Any], ...])
kwargs (Any)
The arguments to this decorator can be used to customize this behavior:
- eq_default indicates whether the eq parameter is assumed to be True or False if it is omitted by the caller.
- order_default indicates whether the order parameter is assumed to be True or False if it is omitted by the caller.
- kw_only_default indicates whether the kw_only parameter is assumed to be True or False if it is omitted by the caller.
- frozen_default indicates whether the frozen parameter is assumed to be True or False if it is omitted by the caller.
- field_specifiers specifies a static list of supported classes or functions that describe fields, similar to dataclasses.field().
- Arbitrary other keyword arguments are accepted in order to allow for possible future extensions.
At runtime, this decorator records its arguments in the __dataclass_transform__ attribute on the decorated object. It has no other runtime effect.
See PEP 681 for more details.
dataclass_wizard.wizard_mixins module¶
Helper Wizard Mixin classes.
- class dataclass_wizard.wizard_mixins.JSONFileWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with JSON files.
This can be paired with the JSONSerializable (JSONWizard) Mixin class for more complete extensibility.
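A hedged sketch of how this mixin is typically paired with JSONWizard; the from_json_file / to_json_file helper names are assumed here, as they are not listed on this page:
from dataclasses import dataclass

from dataclass_wizard import JSONWizard
from dataclass_wizard.wizard_mixins import JSONFileWizard

@dataclass
class Settings(JSONWizard, JSONFileWizard):
    theme: str = 'dark'

s = Settings.from_dict({'theme': 'light'})
s.to_json_file('settings.json')                  # assumed helper: write to a JSON file
same = Settings.from_json_file('settings.json')  # assumed helper: read it back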
- class dataclass_wizard.wizard_mixins.JSONListWizard[source]¶
Bases:
JSONSerializable
A Mixin class that extends JSONSerializable (JSONWizard) to return Container - instead of list - objects.
Note that Container objects are simply convenience wrappers around a collection of dataclass instances. For all intents and purposes, they behave exactly the same as list objects, with some added helper methods:
prettify - Convert the list of instances to a prettified JSON string.
to_json - Convert the list of instances to a JSON string.
to_json_file - Serialize the list of instances and write it to a JSON file.
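For example, a small sketch of the Container returned by from_list (the class and data are illustrative):
from dataclasses import dataclass

from dataclass_wizard.wizard_mixins import JSONListWizard

@dataclass
class Point(JSONListWizard):
    x: int
    y: int

points = Point.from_list([{'x': 1, 'y': 2}, {'x': 3, 'y': 4}])  # a Container
print(points.prettify())   # prettified JSON for the whole list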
- class dataclass_wizard.wizard_mixins.TOMLWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with TOML data.
Note
By default, NO key transform is used in the TOML dump process. In practice, this means that a snake_case field name in Python is saved as snake_case to TOML; however, this can easily be customized without the need to sub-class from JSONWizard.
For example:
>>> @dataclass
>>> class MyClass(TOMLWizard, key_transform='CAMEL'):
>>>     ...
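A hedged, minimal sketch of loading TOML with this mixin (assumes the optional TOML extra, e.g. pip install dataclass-wizard[toml], is installed; names are illustrative):
from dataclasses import dataclass

from dataclass_wizard.wizard_mixins import TOMLWizard

@dataclass
class Server(TOMLWizard):
    host: str
    port: int = 8080

srv = Server.from_toml('host = "localhost"\nport = 9000')
assert srv.host == 'localhost' and srv.port == 9000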
- classmethod from_toml(string_or_stream, *, decoder=None, header='items', parse_float=<class 'float'>)[source]¶
Converts a TOML string to an instance of the dataclass, or a list of the dataclass instances.
If
header
is provided and the corresponding value in the parsed data is alist
, the return type isList[T]
.
- classmethod from_toml_file(file, *, decoder=None, header='items', parse_float=<class 'float'>)[source]¶
Reads the contents of a TOML file and converts them into an instance (or list of instances) of the dataclass.
Similar to
from_toml()
, it can return a list ifheader
is specified and points to a list in the TOML data.
- classmethod list_to_toml(instances, header='items', encoder=None, **encoder_kwargs)[source]¶
Serializes a list of dataclass instances into a TOML string, grouped under a specified header.
- class dataclass_wizard.wizard_mixins.YAMLWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with YAML data.
Note
The default key transform used in the YAML dump process is lisp-case, however this can easily be customized without the need to sub-class from JSONWizard.
For example:
>>> @dataclass
>>> class MyClass(YAMLWizard, key_transform='CAMEL'):
>>>     ...
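A hedged, minimal sketch of loading YAML with this mixin (assumes the optional YAML extra, e.g. pip install dataclass-wizard[yaml], is installed; names are illustrative):
from dataclasses import dataclass

from dataclass_wizard.wizard_mixins import YAMLWizard

@dataclass
class Config(YAMLWizard):
    name: str
    retries: int = 3

cfg = Config.from_yaml('name: demo\nretries: 5')
assert cfg.name == 'demo' and cfg.retries == 5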
- classmethod from_yaml(string_or_stream, *, decoder=None, **decoder_kwargs)[source]¶
Converts a YAML string to an instance of the dataclass, or a list of the dataclass instances.
- classmethod from_yaml_file(file, *, decoder=None, **decoder_kwargs)[source]¶
Reads in the YAML file contents and converts to an instance of the dataclass, or a list of the dataclass instances.
- classmethod list_to_yaml(instances, encoder=None, **encoder_kwargs)[source]¶
Converts a list of dataclass instances to a YAML string representation.
Module contents¶
Dataclass Wizard¶
Lightning-fast JSON wizardry for Python dataclasses — effortless serialization right out of the box!
Sample Usage:
>>> from dataclasses import dataclass, field
>>> from datetime import datetime
>>> from typing import Optional
>>>
>>> from dataclass_wizard import JSONSerializable, property_wizard
>>>
>>>
>>> @dataclass
>>> class MyClass(JSONSerializable, metaclass=property_wizard):
>>>
>>> my_str: Optional[str]
>>> list_of_int: list[int] = field(default_factory=list)
>>> # You can also define this as `my_dt`, however only the annotation
>>> # will carry over in that case, since the value is re-declared by
>>> # the property below.
>>> _my_dt: datetime = datetime(2000, 1, 1)
>>>
>>> @property
>>> def my_dt(self):
>>> # A sample `getter` which returns the datetime with year set as 2010
>>> if self._my_dt is not None:
>>> return self._my_dt.replace(year=2010)
>>> return self._my_dt
>>>
>>> @my_dt.setter
>>> def my_dt(self, new_dt: datetime):
>>> # A sample `setter` which sets the inverse (roughly) of the `month` and `day`
>>> self._my_dt = new_dt.replace(month=13 - new_dt.month,
>>> day=30 - new_dt.day)
>>>
>>>
>>> string = '''{"myStr": 42, "listOFInt": [1, "2", 3]}'''
>>> c = MyClass.from_json(string)
>>> print(repr(c))
>>> # prints:
>>> # MyClass(
>>> # my_str='42',
>>> # list_of_int=[1, 2, 3],
>>> # my_dt=datetime.datetime(2010, 12, 29, 0, 0)
>>> # )
>>> my_dict = {'My_Str': 'string', 'myDT': '2021-01-20T15:55:30Z'}
>>> c = MyClass.from_dict(my_dict)
>>> print(repr(c))
>>> # prints:
>>> # MyClass(
>>> # my_str='string',
>>> # list_of_int=[],
>>> # my_dt=datetime.datetime(2010, 12, 10, 15, 55, 30,
>>> # tzinfo=datetime.timezone.utc)
>>> # )
>>> print(c.to_json())
>>> # prints:
>>> # {"myStr": "string", "listOfInt": [], "myDt": "2010-12-10T15:55:30Z"}
For full documentation and more advanced usage, please see <https://dataclass-wizard.readthedocs.io>.
- copyright:
2021-2025 by Ritvik Nag.
- license:
Apache 2.0, see LICENSE for more details.
- dataclass_wizard.DumpMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the JSON dump (serialization) process, which is intended for use alongside the asdict helper function.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
- class dataclass_wizard.DumpMixin[source]¶
Bases:
AbstractDumper, BaseDumpHook
This Mixin class derives its name from the eponymous json.dumps function. Essentially it contains helper methods to convert Python built-in types to a more ‘JSON-friendly’ version.
- static transform_dataclass_field()¶
Convert a string to Camel Case.
Examples:
>>> to_camel_case("device_type")
'deviceType'
- Return type:
str
- Parameters:
string (str)
- dataclass_wizard.EnvMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the EnvWizard.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractEnvMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> EnvMeta(key_transform_with_dump='SNAKE').bind_to(MyClass)
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
- class dataclass_wizard.EnvWizard[source]¶
Bases:
AbstractEnvWizard
Environment Wizard
A mixin class for parsing and managing environment variables in Python.
EnvWizard makes it easy to map environment variables to Python attributes, handle defaults, and optionally load values from .env files.
Quick Example:
import os
from pathlib import Path

class MyConfig(EnvWizard):
    my_var: str
    my_optional_var: int = 42

# Set environment variables
os.environ["MY_VAR"] = "hello"

# Load configuration from the environment
config = MyConfig()
print(config.my_var)           # Output: "hello"
print(config.my_optional_var)  # Output: 42

# Specify configuration explicitly
config = MyConfig(my_var='world')
print(config.my_var)           # Output: "world"
print(config.my_optional_var)  # Output: 42
Example with .env file:

class MyConfigWithEnvFile(EnvWizard):
    class _(EnvWizard.Meta):
        env_file = True  # Defaults to loading from `.env`

    my_var: str
    my_optional_var: int = 42

# Create an `.env` file in the current directory:
#   MY_VAR=world
config = MyConfigWithEnvFile()
print(config.my_var)           # Output: "world"
print(config.my_optional_var)  # Output: 42
- Key Features:
Automatically maps environment variables to dataclass fields.
Supports default values for fields if environment variables are not set.
Optionally loads environment variables from .env files.
Supports prefixes for environment variables using _env_prefix or Meta.env_prefix.
Supports loading secrets from directories using _secrets_dir or Meta.secrets_dir.
Dynamic reloading with _reload to handle updated environment values.
- Initialization Options:
The __init__ method accepts additional parameters for flexibility:
_env_file (optional):
Overrides the Meta.env_file value dynamically. Can be a file path, a sequence of file paths, or True to use the default .env file.
_reload (optional):
Forces a reload of environment variables to bypass caching. Defaults to False.
_env_prefix (optional):
Dynamically overrides Meta.env_prefix, applying a prefix to all environment variables. Defaults to None.
_secrets_dir (optional):
Overrides the Meta.secrets_dir value dynamically. Can be a directory path or a sequence of paths pointing to directories containing secret files.
- Meta Settings:
These class-level attributes can be configured in a nested Meta class:
env_file:
The path(s) to .env files to load. If set to True, defaults to .env.
env_prefix:
A prefix applied to all environment variables (see the sketch below). Defaults to None.
secrets_dir:
A path or sequence of paths to directories containing secret files. Defaults to None.
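As an illustration of the env_prefix setting above, here is a minimal sketch (the class and variable names are hypothetical):
import os
from dataclass_wizard import EnvWizard

class PrefixedConfig(EnvWizard):
    class _(EnvWizard.Meta):
        env_prefix = 'APP_'  # every lookup uses the `APP_` prefix

    my_var: str

os.environ['APP_MY_VAR'] = 'prefixed value'

config = PrefixedConfig()
print(config.my_var)  # Output: "prefixed value"

# Per the options described above, the prefix can also be overridden per
# instantiation via `_env_prefix`, and `_reload=True` forces a fresh read
# of the environment.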
- Attributes:
Defined dynamically based on the dataclass fields in the derived class.
- class Meta[source]¶
Bases:
BaseEnvWizardMeta
Inner meta class that can be extended by sub-classes for additional customization with the environment load process.
- to_dict(*, cls=None, dict_factory=<class 'dict'>, exclude=None, **kwargs)¶
Return the fields of an instance of an EnvWizard subclass as a new dictionary mapping field names to field values.
Example usage:
class MyEnv(EnvWizard):
    x: int
    y: str

env = MyEnv()
serialized = asdict(env)
When directly invoking this function, an optional Meta configuration for the EnvWizard subclass can be specified via EnvMeta; by default, this will apply recursively to any nested subclasses. Here’s a sample usage of this below:
>>> EnvMeta(key_transform_with_dump='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are EnvWizard subclasses. This will also look into built-in containers: tuples, lists, and dicts.
- Return type:
dict[str, Any]
- Parameters:
o (T)
exclude (Collection[str] | None)
- class dataclass_wizard.JSONFileWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with JSON files.
This can be paired with the JSONSerializable (JSONWizard) Mixin class for more complete extensibility.
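A short sketch of the pairing described above (the dataclass, file name, and the from_json_file / to_json_file helpers shown here are illustrative assumptions, not signatures confirmed on this page):
from dataclasses import dataclass
from dataclass_wizard import JSONWizard, JSONFileWizard

@dataclass
class Settings(JSONWizard, JSONFileWizard):
    theme: str
    font_size: int = 12

s = Settings(theme='dark')
s.to_json_file('settings.json')                 # write the instance out as JSON (assumed helper)
loaded = Settings.from_json_file('settings.json')  # read it back (assumed helper)
print(loaded)                                    # Settings(theme='dark', font_size=12)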
- class dataclass_wizard.JSONListWizard[source]¶
Bases:
JSONSerializable
A Mixin class that extends JSONSerializable (JSONWizard) to return Container - instead of list - objects.
Note that Container objects are simply convenience wrappers around a collection of dataclass instances. For all intents and purposes, they behave exactly the same as list objects, with some added helper methods:
prettify - Convert the list of instances to a prettified JSON string.
to_json - Convert the list of instances to a JSON string.
to_json_file - Serialize the list of instances and write it to a JSON file.
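A minimal sketch of the Container behavior described above (the dataclass and data are hypothetical):
from dataclasses import dataclass
from dataclass_wizard import JSONListWizard

@dataclass
class User(JSONListWizard):
    name: str
    age: int

# `from_list` returns a `Container` rather than a plain `list`.
users = User.from_list([{'name': 'Ann', 'age': 30}, {'name': 'Bo', 'age': 25}])

print(users.to_json())   # compact JSON string for the whole collection
print(users.prettify())  # prettified JSON string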
- class dataclass_wizard.JSONPyWizard[source]¶
Bases:
JSONSerializable
Helper for JSONWizard that ensures dumping to JSON keeps keys as-is.
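For example, a brief sketch of the “keys as-is” behavior (the dataclass and field names are hypothetical):
from dataclasses import dataclass
from dataclass_wizard import JSONPyWizard

@dataclass
class Config(JSONPyWizard):
    max_retries: int

# Keys keep their Python snake_case names on dump (no camelCase transform).
print(Config(max_retries=3).to_dict())  # {'max_retries': 3}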
- class dataclass_wizard.JSONSerializable[source]¶
Bases:
AbstractJSONWizard
- class Meta[source]¶
Bases:
BaseJSONWizardMeta
- classmethod from_dict()¶
Converts a Python dictionary object to a dataclass instance.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
TypeVar(T)
- Parameters:
d (dict[str, Any])
- classmethod from_list()¶
Converts a Python list object to a list of dataclass instances.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
- Return type:
list[TypeVar(T)]
- Parameters:
list_of_dict (list[dict[str, Any]])
- to_dict(*, cls=None, dict_factory=<class 'dict'>, exclude=None, **kwargs)¶
Return the fields of a dataclass instance as a new dictionary mapping field names to field values.
Example usage:

@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.
- Parameters:
o (T)
exclude (Collection[str] | None)
- Return type:
dict[str, Any]
- dataclass_wizard.JSONWizard¶
alias of
JSONSerializable
- dataclass_wizard.LoadMeta(**kwargs)[source]¶
Helper function to set up the Meta config for the JSON load (de-serialization) process, which is intended for use alongside the fromdict helper function.
For descriptions on what each of these params does, refer to the Docs below, or check out the AbstractMeta definition (I want to avoid duplicating the descriptions for params here).
Examples:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
Type[TypeVar(META_, bound=AbstractMeta)]
- class dataclass_wizard.LoadMixin[source]¶
Bases:
AbstractLoader, BaseLoadHook
This Mixin class derives its name from the eponymous json.loads function. Essentially it contains helper methods to convert JSON strings (or a Python dictionary object) to a dataclass which can often contain complex types such as lists, dicts, or even other dataclasses nested within it.
Refer to the AbstractLoader class for documentation on any of the implemented methods.
- classmethod get_parser_for_annotation(ann_type, base_cls=None, extras=None)[source]¶
Returns the Parser (dispatcher) for a given annotation type.
- Return type:
Union[AbstractParser, Callable[[dict[str, Any]], TypeVar(T)]]
- Parameters:
ann_type (Type[T])
base_cls (Type)
extras (Extras)
- static load_after_type_check(o, base_type)[source]¶
- Return type:
TypeVar(T)
- Parameters:
o (Any)
base_type (Type[T])
- static load_func_for_dataclass(cls, config)[source]¶
- Return type:
Callable[[dict[str, Any]], TypeVar(T)]
- Parameters:
cls (Type[T])
config (Type[META_] | None)
- static load_to_bool(o, _)[source]¶
- Return type:
bool
- Parameters:
o (str | bool | int | float)
_ (Type[bool])
- static load_to_date(base_type=<class 'datetime.date'>, default=None, raise_=True)¶
Attempt to convert an object o to a date object using the below logic.
str: convert date strings (in ISO format) via the built-in fromisoformat method.
Number (int or float): convert a numeric timestamp via the built-in fromtimestamp method.
date: return object o if it’s already of this type or a sub-type.
Otherwise, if we’re unable to convert the value of o to a date as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | Number | date)
- static load_to_datetime(base_type=<class 'datetime.datetime'>, default=None, raise_=True)¶
Attempt to convert an object o to a datetime object using the below logic.
str: convert datetime strings (in ISO format) via the built-in fromisoformat method.
Number (int or float): convert a numeric timestamp via the built-in fromtimestamp method, and return a UTC datetime.
datetime: return object o if it’s already of this type or a sub-type.
Otherwise, if we’re unable to convert the value of o to a datetime as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | Number | datetime)
- static load_to_decimal(o, base_type)[source]¶
- Return type:
Decimal
- Parameters:
o (int | float)
base_type (Type[Decimal])
- static load_to_defaultdict(o, base_type, default_factory, key_parser, val_parser)[source]¶
- Return type:
TypeVar(DD, bound=defaultdict)
- Parameters:
o (Dict)
base_type (Type[DD])
default_factory (Callable[[], T])
key_parser (AbstractParser)
val_parser (AbstractParser)
- static load_to_dict(o, base_type, key_parser, val_parser)[source]¶
- Return type:
TypeVar(M, bound=Mapping)
- Parameters:
o (Dict)
base_type (Type[M])
key_parser (AbstractParser)
val_parser (AbstractParser)
- static load_to_enum(o, base_type)[source]¶
- Return type:
TypeVar(E, bound=Enum)
- Parameters:
o (AnyStr | int | float)
base_type (Type[E])
- static load_to_float(o, base_type)[source]¶
- Return type:
Union[int, float]
- Parameters:
o (SupportsFloat | str)
base_type (Type[int | float])
- static load_to_int(base_type=<class 'int'>, default=0, raise_=True)¶
Return o if already an int, otherwise return the int value for a string. If o is None or an empty string, return default instead.
If o cannot be converted to an int, raise an error if raise_ is true; otherwise, return default instead.
- Raises:
TypeError – If o is a bool (which is an int sub-class)
ValueError – When o cannot be converted to an int, and the raise_ parameter is true
- Parameters:
o (str | int | float | bool | None)
- static load_to_iterable(o, base_type, elem_parser)[source]¶
- Return type:
TypeVar(LSQ, list, set, frozenset, deque)
- Parameters:
o (Iterable)
base_type (Type[LSQ])
elem_parser (AbstractParser)
- static load_to_named_tuple(o, base_type, field_to_parser, field_parsers)[source]¶
- Return type:
TypeVar(NT, bound=NamedTuple)
- Parameters:
o (Dict | List | Tuple)
base_type (Type[NT])
field_to_parser (FieldToParser)
field_parsers (List[AbstractParser])
- static load_to_named_tuple_untyped(o, base_type, dict_parser, list_parser)[source]¶
- Return type:
TypeVar(NT, bound=NamedTuple)
- Parameters:
o (Dict | List | Tuple)
base_type (Type[NT])
dict_parser (AbstractParser)
list_parser (AbstractParser)
- static load_to_path(o, base_type)[source]¶
- Return type:
Path
- Parameters:
o (int | float)
base_type (Type[Path])
- static load_to_str(base_type=<class 'str'>)¶
Return o if already a str, otherwise return the string value for o. If o is None, return an empty string instead.
- Parameters:
o (str | None)
- static load_to_time(base_type=<class 'datetime.time'>, default=None, raise_=True)¶
Attempt to convert an object o to a time object using the below logic.
str: convert time strings (in ISO format) via the built-in fromisoformat method.
time: return object o if it’s already of this type or a sub-type.
Otherwise, if we’re unable to convert the value of o to a time as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | time)
- static load_to_timedelta(base_type=<class 'datetime.timedelta'>, default=None, raise_=True)¶
Attempt to convert an object o to a timedelta object using the below logic.
str: if the string is in a numeric form like “1.23”, we convert it to a float and assume it’s in seconds. Otherwise, we convert strings via the pytimeparse.parse function.
int or float: a numeric value is assumed to be in seconds. In this case, it is passed in to the constructor like timedelta(seconds=...).
timedelta: return object o if it’s already of this type or a sub-type.
Otherwise, if we’re unable to convert the value of o to a timedelta as expected, raise an error if the raise_ parameter is true; if not, return default instead.
- Parameters:
o (str | int | float | timedelta)
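To show how the date, datetime, and timedelta conversion rules above surface during a load, here is a minimal sketch using the fromdict helper on a plain dataclass (the field names and values are hypothetical):
from dataclasses import dataclass
from datetime import date, datetime, timedelta
from dataclass_wizard import fromdict

@dataclass
class Event:
    day: date
    starts_at: datetime
    duration: timedelta

event = fromdict(Event, {
    'day': '2021-01-02',      # ISO date string   -> date.fromisoformat
    'starts_at': 1610000000,  # numeric timestamp -> datetime.fromtimestamp (UTC)
    'duration': 90,           # numeric value     -> timedelta(seconds=90)
})
print(event)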
- static load_to_tuple(o, base_type, elem_parsers)[source]¶
- Return type:
Tuple
- Parameters:
o (List | Tuple)
base_type (Type[Tuple])
elem_parsers (Sequence[AbstractParser])
- static load_to_typed_dict(o, base_type, key_to_parser, required_keys, optional_keys)[source]¶
- Return type:
TypeVar(M, bound=Mapping)
- Parameters:
o (Dict)
base_type (Type[M])
key_to_parser (FieldToParser)
required_keys (frozenset[str])
optional_keys (frozenset[str])
- static load_to_uuid(o, base_type)[source]¶
- Return type:
TypeVar(U, bound=UUID)
- Parameters:
o (AnyStr | U)
base_type (Type[U])
- static transform_json_field()¶
Make an underscored, lowercase form from the expression in the string.
Example:
>>> to_snake_case("DeviceType")
'device_type'
- Return type:
str
- Parameters:
string (str)
- dataclass_wizard.SkipIf(condition)[source]¶
Mark a condition to be used as a skip directive during serialization.
- class dataclass_wizard.TOMLWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with TOML data.
Note
By default, NO key transform is used in the TOML dump process. In practice, this means that a snake_case field name in Python is saved as snake_case to TOML; however, this can easily be customized without the need to sub-class from JSONWizard.
For example:
>>> @dataclass
>>> class MyClass(TOMLWizard, key_transform='CAMEL'):
>>>     ...
- classmethod from_toml(string_or_stream, *, decoder=None, header='items', parse_float=<class 'float'>)[source]¶
Converts a TOML string to an instance of the dataclass, or a list of the dataclass instances.
If header is provided and the corresponding value in the parsed data is a list, the return type is List[T].
- classmethod from_toml_file(file, *, decoder=None, header='items', parse_float=<class 'float'>)[source]¶
Reads the contents of a TOML file and converts them into an instance (or list of instances) of the dataclass.
Similar to from_toml(), it can return a list if header is specified and points to a list in the TOML data.
- classmethod list_to_toml(instances, header='items', encoder=None, **encoder_kwargs)[source]¶
Serializes a list of dataclass instances into a TOML string, grouped under a specified header.
- class dataclass_wizard.YAMLWizard[source]¶
Bases:
object
A Mixin class that makes it easier to interact with YAML data.
Note
The default key transform used in the YAML dump process is lisp-case; however, this can easily be customized without the need to sub-class from JSONWizard.
For example:
>>> @dataclass
>>> class MyClass(YAMLWizard, key_transform='CAMEL'):
>>>     ...
- classmethod from_yaml(string_or_stream, *, decoder=None, **decoder_kwargs)[source]¶
Converts a YAML string to an instance of the dataclass, or a list of the dataclass instances.
- classmethod from_yaml_file(file, *, decoder=None, **decoder_kwargs)[source]¶
Reads in the YAML file contents and converts them to an instance of the dataclass, or a list of the dataclass instances.
- classmethod list_to_yaml(instances, encoder=None, **encoder_kwargs)[source]¶
Converts a list of dataclass instances to a YAML string representation.
- dataclass_wizard.asdict(o, *, cls=None, dict_factory=<class 'dict'>, exclude=None, **kwargs)[source]¶
Return the fields of a dataclass instance as a new dictionary mapping field names to field values.
Example usage:

@dataclass
class C:
    x: int
    y: int

c = C(1, 2)
assert asdict(c) == {'x': 1, 'y': 2}
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via DumpMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:
>>> DumpMeta(key_transform='CAMEL').bind_to(MyClass)
>>> asdict(MyClass(my_str="value"))
If given, ‘dict_factory’ will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts.
- Parameters:
o (T)
exclude (Collection[str] | None)
- Return type:
dict[str, Any]
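A short sketch of the exclude parameter, assuming it takes the dataclass field names to omit (the dataclass and data are hypothetical):
from dataclasses import dataclass
from dataclass_wizard import asdict

@dataclass
class Account:
    username: str
    password: str
    active: bool = True

acct = Account('admin', 'hunter2')

# `exclude` is a collection of field names to drop from the output.
print(asdict(acct, exclude=['password']))
# e.g. {'username': 'admin', 'active': True}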
- dataclass_wizard.env_field(keys, *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)¶
- dataclass_wizard.fromdict(cls, d)[source]¶
Converts a Python dictionary object to a dataclass instance.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
When directly invoking this function, an optional Meta configuration for the dataclass can be specified via LoadMeta; by default, this will apply recursively to any nested dataclasses. Here’s a sample usage of this below:
>>> LoadMeta(key_transform='CAMEL').bind_to(MyClass)
>>> fromdict(MyClass, {"myStr": "value"})
- Return type:
TypeVar(T)
- Parameters:
cls (type[T])
d (dict[str, Any])
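A sketch of the recursive behavior described above, with a nested dataclass and a list field (the names are hypothetical):
from dataclasses import dataclass
from dataclass_wizard import fromdict

@dataclass
class Inner:
    value: int

@dataclass
class Outer:
    name: str
    inner: Inner
    tags: list[str]

# Nested dataclasses and built-in containers are loaded recursively.
outer = fromdict(Outer, {'name': 'demo', 'inner': {'value': 1}, 'tags': ['a', 'b']})
print(outer)  # Outer(name='demo', inner=Inner(value=1), tags=['a', 'b'])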
- dataclass_wizard.fromlist(cls, list_of_dict)[source]¶
Converts a Python list object to a list of dataclass instances.
Iterates over each dataclass field recursively; lists, dicts, and nested dataclasses will likewise be initialized as expected.
- Return type:
list[TypeVar(T)]
- Parameters:
cls (type[T])
list_of_dict (list[dict[str, Any]])
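For example, a minimal sketch (the dataclass and data are hypothetical):
from dataclasses import dataclass
from dataclass_wizard import fromlist

@dataclass
class User:
    name: str
    age: int

# Each dict in the list is converted to a dataclass instance.
users = fromlist(User, [{'name': 'Ann', 'age': 30}, {'name': 'Bo', 'age': 25}])
print(users)  # [User(name='Ann', age=30), User(name='Bo', age=25)]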
- dataclass_wizard.json_field(keys, *, all=False, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)[source]¶
- dataclass_wizard.path_field(keys, *, all=True, dump=True, default=<dataclasses._MISSING_TYPE object>, default_factory=<dataclasses._MISSING_TYPE object>, init=True, repr=True, hash=None, compare=True, metadata=None)[source]¶
- dataclass_wizard.property_wizard(*args, **kwargs)[source]¶
Adds support for field properties with default values in dataclasses.
For examples of usage, please see the Using Field Properties section in the docs. I also added an answer on a Stack Overflow post that deals with using such properties in dataclasses.