__import__ or import_module versus import when linking multiple Django settings files

I am trying to modularize my Django settings file to make it easier to deploy and manage our multiple environments.

I have set things up so my files load in this order:

  • settings.py - settings common to everyone
  • config/country_XX.py - settings specific to the installation for a given country (XX can be US, CA, AU, etc.)
  • config/developer_XX.py - settings specific to the local developer's environment

Later files are allowed to override values set in the earlier files.

I find that if file 1 loads files 2 and 3 using a normal

from config.country_XX import *

      

the overrides in those files take effect.

If, however, file 1 loads files 2 and 3 using

__import__() 

      

or

importlib.import_module()

      

the overrides are not applied.
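The difference comes down to what each form does with names. `from module import *` copies the module's public names into the importing module's namespace, while `__import__()` and `importlib.import_module()` only execute the module and return the module object; nothing is copied into the caller's namespace, so assignments made in the imported file never reach `settings`. A minimal sketch of the workaround, using the stdlib `string` module as a stand-in for one of the config modules:

```python
import importlib

# import_module() returns the module object; unlike "from ... import *",
# it does NOT bind the module's names in the caller's namespace.
mod = importlib.import_module('string')

# Copy the module's public names into the current namespace to get the
# same effect as "from string import *":
globals().update(
    (name, value) for name, value in vars(mod).items()
    if not name.startswith('_')
)

print(ascii_lowercase)  # now usable as a bare name, as with a wildcard import
```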

I would rather use import_module because it lets me write cleaner code:

import_module('config.country_' + country)

      

instead of:

if country == 'AA':
    from config.country_AA import *
elif country == 'BB':
    from config.country_BB import *
...

      

Here's what I have ... let me know what you think.

settings.py

import os
import sys
import importlib  # django.utils.importlib is just a deprecated alias for this

DEVELOPMENT = True
DEBUG = False
USES_24_HOUR_TIME = True

country_config = 'config.country_us'
developer_config = 'config.developer_jack'
try:
    #importlib.import_module(country_config)
    from config.country_us import *

    if DEVELOPMENT:
        #importlib.import_module(developer_config)
        from config.developer_jack import *

except ImportError:
    pass

      

config / country_us.py

import sys
globals().update(vars(sys.modules['settings']))

USES_24_HOUR_TIME = False

      

config / developer_jack.py

import sys
globals().update(vars(sys.modules['settings']))

DEBUG = True
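
Since `import_module` returns the module object, one option is to do the name-copying in settings.py itself, which would also remove the need for the `globals().update(vars(sys.modules['settings']))` hack in the country/developer files. A sketch under that assumption, with in-memory stand-ins for the real config modules so it runs on its own (in the real project these would be config/country_us.py and config/developer_jack.py):

```python
import importlib
import sys
import types

# Simulated config package; placeholders for the real files on disk.
country_us = types.ModuleType('config.country_us')
country_us.USES_24_HOUR_TIME = False
developer_jack = types.ModuleType('config.developer_jack')
developer_jack.DEBUG = True
sys.modules['config.country_us'] = country_us
sys.modules['config.developer_jack'] = developer_jack

# Base settings, as in settings.py.
DEVELOPMENT = True
DEBUG = False
USES_24_HOUR_TIME = True

def override_from(module_name):
    """Import a settings module by dotted name and copy its public names
    into this module's namespace, mimicking "from <module_name> import *"."""
    module = importlib.import_module(module_name)
    globals().update(
        (name, value) for name, value in vars(module).items()
        if not name.startswith('_')
    )

override_from('config.country_us')
if DEVELOPMENT:
    override_from('config.developer_jack')

print(DEBUG, USES_24_HOUR_TIME)  # -> True False
```

The dotted name can then be built dynamically, e.g. `override_from('config.country_' + country)`, without the if/elif chain.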

      
