Reputation: 453
I'm doing a research project on detecting breaking changes from Python library upgrades. One of the steps is to extract the difference between two major versions of the same Python library using static analysis (it could be AST-based or not), in order to triage the patterns of change. The detection should not only find differences in .py files, but also differences in other project files, including config files, resources, etc. Ideally, a scenario like a .py file being moved to another module should also be covered. So I have two questions here:
Sorry, this might be a silly question; I'm not coming from a Python background and am really running out of ideas here. Any thoughts, ideas, and inputs are welcome. Thanks in advance.
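To make the goal concrete, here's a rough sketch of the cross-file part I have in mind (stdlib only; the `old/` and `new/` paths are hypothetical unpacked release directories). It hashes every file in two release trees so that byte-identical files that moved can be paired up, and everything else falls out as added, removed, or modified:

```python
# Rough sketch: classify files across two release trees as moved, added,
# removed, or modified, using content hashes. Stdlib only; "old/" and
# "new/" are hypothetical unpacked release directories.
import hashlib
from pathlib import Path

def hash_tree(root: Path) -> dict:
    """Map relative path -> SHA-256 of contents, for every file under root."""
    return {
        p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def diff_trees(old_root: str, new_root: str) -> None:
    old, new = hash_tree(Path(old_root)), hash_tree(Path(new_root))
    # Reverse index; duplicate contents collapse to one path, fine for a sketch.
    old_by_hash = {h: p for p, h in old.items()}
    new_hashes = set(new.values())
    for path, h in new.items():
        if path in old:
            if old[path] != h:
                print(f"modified: {path}")
        elif h in old_by_hash:
            print(f"moved: {old_by_hash[h]} -> {path}")  # same bytes, new home
        else:
            print(f"added: {path}")
    for path, h in old.items():
        if path not in new and h not in new_hashes:
            print(f"removed: {path}")

diff_trees("old", "new")
```

This only catches files that moved verbatim; a moved-and-edited .py file would need a fuzzier match (e.g. on the AST or on top-level definition names), which is exactly the part I'm unsure how to approach.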
Upvotes: 1
Views: 96
Reputation: 3189
Just spitballing some ideas here:
I don't think I'd be too concerned about detecting changes in the source files up front. There are a lot of ways to move code around among files without changing the interface to the module. For example, you can put all of the code in `__init__.py`, or you can split it up into any number of files and subdirectories; either way, the programmatic interface stays the same.
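For instance (a toy illustration with a hypothetical package `mylib`, not any real library), both of these layouts give callers the exact same `from mylib import Widget` interface:

```python
# Layout A: everything in one file.
#   mylib/__init__.py
#       class Widget: ...
#
# Layout B: split into a submodule, then re-exported.
#   mylib/widgets.py
#       class Widget: ...
#   mylib/__init__.py
#       from .widgets import Widget
#
# Either way, user code is identical:
#   from mylib import Widget
```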
Instead, you could use the `dir()` built-in to detect changes in the public classes and methods of the module. This will work well for libraries that use named arguments, but won't work well for functions that are just defined as `def func(*args, **kwargs)` (which is why that pattern should be avoided, all you former Perl programmers!).
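A minimal sketch of that idea (my own throwaway script, not a standard tool): run it once per installed version, e.g. in two separate virtualenvs, then diff the dumps. `inspect.signature` adds the named arguments where they're available:

```python
# dump_api.py -- dump a module's public top-level API as JSON.
# Run once per installed version of the library, then diff the outputs.
import importlib
import inspect
import json
import sys

def public_api(module_name: str) -> dict:
    """Map each public top-level name to its signature (None if it has none)."""
    mod = importlib.import_module(module_name)
    api = {}
    for name in dir(mod):
        if name.startswith("_"):
            continue  # skip private and dunder names
        obj = getattr(mod, name)
        try:
            api[name] = str(inspect.signature(obj))
        except (TypeError, ValueError):
            api[name] = None  # not callable, or signature unavailable (C code)
    return api

if __name__ == "__main__":
    json.dump(public_api(sys.argv[1]), sys.stdout, indent=2, sort_keys=True)
```

E.g. `python dump_api.py requests > v1.json` in one environment, the same into `v2.json` in the other, then a plain `diff v1.json v2.json`.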
If the module uses the newer type hinting, you can really get some mileage out of detecting changes in types. A tool that actually parses the Python and infers types would work as well; VS Code's context-sensitive help is built on exactly that kind of library (Pyright, via the Pylance extension).
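If you'd rather stay fully static, the stdlib `ast` module already gets you the annotated signatures without importing (or trusting) the library's code. A sketch (needs Python 3.9+ for `ast.unparse`; it only looks at plain positional args, and the `old/`/`new/` paths are hypothetical):

```python
# Static sketch: pull annotated function signatures straight from source text
# with the stdlib ast module -- no import, no execution. Python 3.9+.
import ast

def annotated_signatures(source: str) -> dict:
    """Map each function name to ({arg: annotation}, return annotation)."""
    sigs = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            args = {
                a.arg: ast.unparse(a.annotation) if a.annotation else None
                for a in node.args.args  # ignores *args/**kwargs/keyword-only
            }
            returns = ast.unparse(node.returns) if node.returns else None
            sigs[node.name] = (args, returns)  # nested names can collide; ok for a sketch
    return sigs

old = annotated_signatures(open("old/mylib/api.py").read())
new = annotated_signatures(open("new/mylib/api.py").read())
for name in old.keys() & new.keys():
    if old[name] != new[name]:
        print(f"signature changed: {name}: {old[name]} -> {new[name]}")
```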
Upvotes: 1