
Implement Overloaded.#92

Merged
dnwpark merged 5 commits into main from overload on Feb 11, 2026
Conversation

@dnwpark
Contributor

@dnwpark dnwpark commented Feb 10, 2026

No description provided.

@vercel

vercel bot commented Feb 10, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: python-typemap | Status: Ready | Actions: Preview, Comment | Updated: Feb 10, 2026 11:03pm

functions: tuple[types.FunctionType, ...]


def _is_overloaded_function(func):
Collaborator
Seems like a bad name, because it's getting the overloaded function rather than testing for one.



def _is_overloaded_function(func):
module_overload_registry = typing._overload_registry[func.__module__]
Collaborator
Can we not use typing.get_overloads? If there's some reason we can't, document it here.

Comment on lines -408 to -487
# TODO: Potentially most of this could be ripped out. The internals
# don't use this at all, it's only used by format_class.
def _flatten_class_explicit(
cls: type[Any], ctx: _eval_typing.EvalContext
) -> type[_eval_typing._EvalProxy]:
cls_boxed = box(cls)
mro_boxed = cls_boxed.mro

# TODO: I think we want to create the whole mro chain...
# before we evaluate the contents?

# FIXME: right now we flatten out all the attributes... but should we??
# XXX: Yeah, a lot of work is put into copying everything into every
# class and it is not worth it, at all.

new = {}

# Run through the mro and populate everything
for boxed in reversed(mro_boxed):
# We create it early so we can add it to seen, to handle recursion
# XXX: currently we are doing this even for types with no generics...
# that simplifies the flow... - probably keep it this way until
# we stop flattening attributes into every class
name = boxed.cls.__name__
cboxed: Any

args = tuple(boxed.args.values())
args_str = ", ".join(_type_repr(a) for a in args)
fullname = f"{name}[{args_str}]" if args_str else name
cboxed = type(
fullname,
(_eval_typing._EvalProxy,),
{
"__module__": boxed.cls.__module__,
"__name__": fullname,
"__origin__": boxed.cls,
"__local_args__": args,
},
)
new[boxed] = cboxed

annos: dict[str, Any] = {}
dct: dict[str, Any] = {}
sources: dict[str, Any] = {}

cboxed.__local_annotations__, cboxed.__local_defns__ = get_local_defns(
boxed
)
for base in reversed(boxed.mro):
cbase = new[base]
annos.update(cbase.__local_annotations__)
dct.update(cbase.__local_defns__) # uh.
for k in [*cbase.__local_annotations__, *cbase.__local_defns__]:
sources[k] = cbase

cboxed.__defn_names__ = set(dct)
cboxed.__annotations__ = annos
cboxed.__defn_sources__ = sources
cboxed.__generalized_mro__ = [new[b] for b in boxed.mro]

for k, v in dct.items():
setattr(cboxed, k, v)

# Run through the mro again and evaluate everything
for cboxed in new.values():
for k, v in cboxed.__annotations__.items():
cboxed.__annotations__[k] = _eval_typing._eval_types(v, ctx=ctx)

for k in cboxed.__defn_names__:
v = cboxed.__dict__[k]
setattr(cboxed, k, _eval_typing._eval_types(v, ctx=ctx))

return new[cls_boxed]


def flatten_class_explicit(obj: typing.Any):
with _eval_typing._ensure_context() as ctx:
return _flatten_class_explicit(obj, ctx)


Collaborator
Remove this dead code as a standalone PR, please, since it might be worth recovering at some point.

@dnwpark dnwpark merged commit 60a4f35 into main Feb 11, 2026
3 checks passed
@dnwpark dnwpark deleted the overload branch February 11, 2026 20:38