I have a Python package named foo, which I use in imports:

import foo.conf
from foo.core import Something

Now I need to rename the foo package to something else, let's say bar, so I want to do:

import bar.conf
from bar.core import Something

but I want to maintain backward compatibility with existing code, so the old (foo.*) imports should keep working and do the same as the bar.* imports.
How can this be accomplished in Python 2.7?
This forces you to keep a foo directory, but I think it's the best way to get this to work.
Directory setup:
bar
├── __init__.py
└── baz.py
foo
└── __init__.py
foo_bar.py
bar/__init__.py is empty.
bar/baz.py:
worked = True
foo/__init__.py:
import sys
# Make sure bar is loaded and present in sys.modules.
import bar
# Alias this module to bar, so 'import foo' yields the bar module object.
sys.modules[__name__] = sys.modules['bar']
# Or, equivalently:
# sys.modules[__name__] = __import__('bar')
foo_bar.py:
import foo.baz
assert hasattr(foo, 'baz') and hasattr(foo.baz, 'worked')
assert foo.baz.worked
import bar
assert foo is bar
This does not work properly: if the package has submodules, the submodules imported under the two names are not the same objects.
For a reliable solution that includes submodules, you need to explicitly add an alias for each submodule to sys.modules as well. This works (I've tried it on Python 3.6/3.7, so it covers recent versions too) and avoids the subtle bug where you end up with multiple instances of the same module under different names. That bug can cause nightmares if you have per-module static state, or isinstance() checks that get confused when there are multiple versions of the same class with different names. In my experience it also works better than the MetaPathFinder approach.
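As a minimal, self-contained sketch of this explicit-aliasing idea: the snippet below creates stand-in `bar` and `bar.baz` modules with `types.ModuleType` (instead of real files on disk, so it runs anywhere), then registers a `foo`/`foo.baz` alias in `sys.modules` for the package and every submodule. The stand-in module names are illustrative, not from the original setup.

```python
import sys
import types

# Hypothetical stand-ins for a real package on disk: build 'bar' and
# 'bar.baz' directly in sys.modules so the example is self-contained.
bar = types.ModuleType('bar')
baz = types.ModuleType('bar.baz')
baz.worked = True
bar.baz = baz
sys.modules['bar'] = bar
sys.modules['bar.baz'] = baz

# The fix: alias 'bar' and every 'bar.*' submodule under the old name
# 'foo', so both names resolve to the *same* module objects.
for name, module in list(sys.modules.items()):
    if name == 'bar' or name.startswith('bar.'):
        sys.modules['foo' + name[len('bar'):]] = module

# Old-style imports are now satisfied entirely from sys.modules.
import foo.baz

assert foo is bar
assert foo.baz is bar.baz
assert foo.baz.worked
```

In a real package you would run the aliasing loop from `foo/__init__.py` after importing `bar` and each submodule you want to expose; submodules that were never imported under the `bar` name have no `sys.modules` entry to alias yet.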