[PEAK] imports (Re: Package organization)
alexander smishlajev
alex at ank-sia.com
Thu Dec 4 12:30:00 EST 2003
Phillip J. Eby wrote, at 04.12.2003 16:13:
>> i would prefer peak.api (or whatever name is chosen for such module)
>> to export only core - no primitives, no frameworks.
>>
>> however, this issue does not affect me or my colleagues since the
>> coding policy in our company is to avoid 'from module import *' in
>> favor of 'import module' (better) or 'from module import symbol1,
>> symbol2'. so we never do 'from peak.api import *'.
>
> I'm confused, on two points. First, why do you care what it exports if
> you're not going to use import *?
that's probably because of my bad english. when i said "this issue does
not affect me", i meant that i do not actually care. but if i did care,
i would want to avoid namespace pollution.
> Second, if it doesn't export the primitives (especially NOT_GIVEN
> and NOT_FOUND), where will you get them from?
and that's because i misread what the primitives are. somehow protocols
and adapt captured my attention. protocols is an independent module,
and my personal preference is to import it explicitly, not to get it as
a bonus from the peak core api. of course, this makes sense only if the
export semantics change from "get all of PEAK" to "get the things
required to work with PEAK", i.e. if frameworks and other components are
not exported by default.
>> first, there is an inconsistency between LazyModule and _loadAndRunHooks:
>> if any of the hooks fails, _loadAndRunHooks still believes that the
>> module is imported (by disabling all postLoadHooks for that module),
>> but LazyModule thinks that the module is *not* imported, and tries to
>> run _loadAndRunHooks again upon the next attribute access. this leads
>> to an AlreadyRead exception.
>
> True. And then the module concludes it's *still* not imported.
> Unfortunately I don't think there's much I can do about this, except
> maybe allow the hooks to be run more than once, which I don't really
> like.
maybe just remove the hooks that ran successfully? and then isolate hook
effects by taking a copy of the module __dict__ and restoring it if a
hook raises an exception? i am not sure whether such an approach is
implementable; these are just crazy ideas.
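a rough sketch of the idea (the names here are only placeholders, not
the actual peak machinery):

    def runPostLoadHooks(module, hooks):
        # run each pending hook once; hooks that succeed are removed from
        # the pending list so they are never run a second time
        pending = list(hooks)
        for hook in list(pending):
            snapshot = module.__dict__.copy()   # isolate the hook's effects
            try:
                hook(module)
            except:
                # roll the module back to its pre-hook state, keep the
                # failed hook queued, and let the error propagate
                module.__dict__.clear()
                module.__dict__.update(snapshot)
                raise
            pending.remove(hook)
        return pending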
>> module = imp.new_module(fqname)
>> exec code in module.__dict__
>>
>> this launches _loadAndRunHooks upon access to module.__dict__ (or to
>> any special import attribute, like '__importer__' or '__ispkg__')
>> before the module code has been executed, and always leads to the
>> problem shown in the above example.
>
> I'm a little puzzled as to how they get to that point, unless
> imp.new_module is returning the module from sys.modules.
it is not. the module created by imp.new_module does not appear in
sys.modules.
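for example, a quick check (not taken from the peak sources):

    import imp, sys
    m = imp.new_module('some.module')
    # the new module object exists, but is not registered anywhere
    assert 'some.module' not in sys.modules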
> I suppose I
> could make the __getattribute__ avoid loading on access to attributes
> that begin and end with double underscores.
this may behave somewhat better, but there are still things like
config.interfaces accessing its own IConfigKey via indirect recursion in
the very first line of code, before IConfigKey has been created.
> But, it's possible that for
> some attributes, this would be a bad thing. For example, a module's
> __conform__ method would be accessed by adapt(). Without a
> comprehensive list of which attributes to ignore, it would be unlikely
> to be correct in all cases. I could add such a list of attributes known
> to be read by importers, I suppose.
i am pretty sure about three names: "__dict__", "__ispkg__" and
"__importer__". i am not saying that i want this ASAP; as i said
before, we have already found a solution that is acceptable for us.
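to illustrate the idea (this is not peak's actual LazyModule, just a
sketch of the guard, with a stub standing in for the real loader):

    from types import ModuleType

    # names known to be read by the import machinery before the module
    # body has run; access to them should not trigger the lazy load
    _importerAttrs = ('__dict__', '__ispkg__', '__importer__')

    def _loadAndRunHooks(module):
        pass    # placeholder for the real loading machinery

    class LazyModule(ModuleType):
        def __getattribute__(self, name):
            if name not in _importerAttrs:
                _loadAndRunHooks(self)
            return ModuleType.__getattribute__(self, name)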
>> for my application packaging, i found a workaround to overcome peak
>> module laziness, so these problems do not bite me much yet.
>
> Just for my information, could you tell me what that is?
sure.
i gave up on using a binary packaging system, such as marshalled modules
in the McMillan installer or compiled modules and imputil hooks in
py2exe. instead, i decided to distribute the source code of the python
modules. all of the modules required by the application are packaged
with InnoSetup, thus eliminating the support problems arising from
multiple external dependencies. an additional advantage of the source
distribution is that the source code is on hand when a programmer has to
solve a problem off-site (sometimes there are problems that cannot be
tracked down in our office).
i use Analysis from the McMillan installer to find most of the modules
used. some modules are not found this way, and i list their names in a
modulefinder hook as 'hiddenimports'. three PEAK modules still cannot
be found by Analysis: peak.binding.api, peak.running.api,
peak.interface; i add them to the resulting TOC manually.
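the hook file itself is tiny; something like this (the module names
below are placeholders, the real list depends on what Analysis misses):

    # hook-myapp.py -- McMillan installer module hook; Analysis picks up
    # the names listed in 'hiddenimports' even though it cannot find the
    # corresponding import statements itself
    hiddenimports = [
        'myapp.plugins.default',
        'encodings.cp1251',
    ]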
before running Analysis i do
from peak.api import binding, config, naming, running
to let the modulefinder use the already-imported modules; by the time
Analysis runs, all of the peak lazy evaluations have already been done.
i also import all modules listed in 'hiddenimports', for the same reason.
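in the .spec file that pre-import step is just a couple of lines
executed before Analysis (a sketch; the list of names is whatever went
into 'hiddenimports'):

    # importing these here forces all of peak's lazy module loading to
    # happen before Analysis walks the already-imported modules
    from peak.api import binding, config, naming, running

    for name in ['myapp.plugins.default', 'encodings.cp1251']:
        __import__(name)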
when the Analysis is done, i use its results to build the TOCs for the
COLLECT step. after minor tweaking of the COLLECT results i end up with
a binary wrapper (.exe) for the main application script and a 'lib'
subdirectory containing all of the needed python modules (both .py and
.pyd) as well as the python dll. the .exe and the lib directory are then
packaged with InnoSetup.
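roughly, the relevant part of the .spec looks like this (written from
memory, so the exact Analysis/COLLECT arguments may well differ; the
paths are hypothetical):

    a = Analysis(['main.py'], hookspath=['hooks'])
    # the three api modules Analysis cannot see, appended to the TOC by hand
    a.pure = a.pure + [
        ('peak.binding.api', r'C:\Python23\Lib\site-packages\peak\binding\api.py', 'PYMODULE'),
        ('peak.running.api', r'C:\Python23\Lib\site-packages\peak\running\api.py', 'PYMODULE'),
        ('peak.interface',   r'C:\Python23\Lib\site-packages\peak\interface.py',   'PYMODULE'),
    ]
    coll = COLLECT(a.pure, a.binaries, name='lib')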
if you are interested, i can pass along all of the sources involved in
the process: the installer .spec file, the module hook, the executable
stub and the script that makes an executable from this stub.
best wishes,
alex.