Replacing module with a stub for unit testing
Hi, I'm working on a unit test framework for a module. The module I'm testing indirectly calls another module which is expensive to access --- CDLLs whose functions access a database.

    test_MyModule ---> MyModule ---> IntermediateModule ---> ExpensiveModule

I want to create a stub of ExpensiveModule and have that be accessed by IntermediateModule instead of the real version:

    test_MyModule ---> MyModule ---> IntermediateModule ---> ExpensiveModuleStub

I tried the following in my unittest:

    import ExpensiveModuleStub
    sys.modules['ExpensiveModule'] = ExpensiveModuleStub  # Doesn't work

But import statements in IntermediateModule still access the real ExpensiveModule, not the stub. The examples I can find of creating and using Mock or Stub objects all seem to follow a pattern where the fake objects are passed in as arguments to the code being tested (for example, see the "Example Usage" section here: http://python-mock.sourceforge.net). But that doesn't work in my case, as the module I'm testing doesn't directly use the module that I want to replace.

Can anybody suggest something?

Thanks,
Scott
--
http://mail.python.org/mailman/listinfo/python-list
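The timing is the usual catch with this technique: the sys.modules replacement has to happen before IntermediateModule is imported for the first time, because its `import ExpensiveModule` statement binds whatever sys.modules holds at that moment. A minimal self-contained sketch of the idea (the module name and the `query` function are made up for illustration; in practice the stand-in would be your ExpensiveModuleStub):

```python
import sys
import types

# Build a throwaway stand-in module object; in a real test suite this
# would be your hand-written ExpensiveModuleStub module.
stub = types.ModuleType('ExpensiveModule')
stub.query = lambda: 'stub result'  # hypothetical expensive function

# Install the stub BEFORE anything imports the real module...
sys.modules['ExpensiveModule'] = stub

# ...so that subsequent import statements resolve to the stub.
import ExpensiveModule
print(ExpensiveModule.query())  # prints: stub result
```

If IntermediateModule has already been imported by the time the test runs, the other option is to rebind its reference directly, e.g. `IntermediateModule.ExpensiveModule = ExpensiveModuleStub`.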
Re: Where is the correct round() method?
It could be that 3.0 is using "banker's rounding" --- rounding ties to the even digit --- the idea behind it being to reduce error accumulation when working with large sets of values.

Works for me on Python 2.5 on Linux running on an "Intel(R) Core(TM)2 Duo CPU". What system are you on? It could be that your 2.5 is really 2.4..., which would round down to 2; but on any modern CPU (using IEEE floating point), 2.5 should be representable exactly.
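For the record, banker's rounding is indeed what Python 3's round() does: exact halves go to the nearest even integer, whereas Python 2's round() rounded halves away from zero. A quick illustration (run under Python 3):

```python
# Python 3: ties round to the nearest EVEN integer ("banker's rounding"),
# so half of the ties round down and half round up.
print(round(2.5))  # prints: 2
print(round(3.5))  # prints: 4
print(round(0.5))  # prints: 0
print(round(1.5))  # prints: 2
```

Under Python 2 the same calls would print 3.0, 4.0, 1.0, and 2.0, which is where the confusion usually comes from.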
ctypes - unloading implicitly loaded dlls
(my apologies if this is a repost, but it sure seems like the first attempt disappeared into the ether...)

I'm writing a program that uses functionality from two different sets of cdlls which reside in two different directories; call them 'libA.dll' and 'libB.dll'. Although I don't directly use it, both directories contain a dll with the same name, although the two aren't in fact identical; call them "libC.dll". However, the c-functions I call from the dlls I do use seem to implicitly use "libC.dll". The problem that occurs is that after I load one dll and call functions in it, when I try to load the second dll I get windows errors, because the second dll tries to call a function in its version of libC.dll, but it finds the version meant for libA.dll, which doesn't contain that function. Oy, I hope some sample code makes it clearer:

    def demo():
        A = ctypes.cdll.LoadLibrary('/path1/libA.dll')
        A.foo()  # implicitly uses '/path1/libC.dll'
        _ctypes.FreeLibrary(A._handle)
        B = ctypes.cdll.LoadLibrary('/path2/libB.dll')  # CRASH!
        # "The procedure entry point some_func could not be located
        #  in the dynamic link library libC.dll."
        # libB.dll wants to use code from '/path2/libC.dll', but instead
        # it finds '/path1/libC.dll' already loaded in memory, which
        # doesn't contain the function call it wants.

Assuming my understanding of things is correct, then I believe what I need to do is to remove /path1/libC.dll from memory before I try loading libB.dll, but I haven't found any way of doing that. Can anyone offer me some suggestions? Or am I S.O.L.?

Notes:
* The two sets of dlls are supplied by a vendor for working with its COTS packages; I don't have any control over the dll names used or the code therein.
* If I leave out the call to A.foo(), then I don't crash; but if I leave out the FreeLibrary call as well, then I do crash.
* I've tried manipulating the PATH before loading the dlls, to no effect.
* I've tried del'ing A and running gc.collect() before loading B.
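One thing worth trying (a sketch only; whether the OS actually evicts the library depends on its internal reference count, so this may not be enough if the vendor dll still holds libC) is to take an explicit handle on the shared library and free it yourself before loading the second dll. The snippet below demonstrates the load/free mechanics against the standard C math library rather than the vendor dlls, since those aren't available here:

```python
import ctypes
import ctypes.util
import sys
import _ctypes

# Explicitly load some shared library so we hold our own handle to it.
# On POSIX this finds e.g. libm; 'msvcrt' is an assumed Windows fallback.
libname = ctypes.util.find_library('m') or 'msvcrt'
lib = ctypes.CDLL(libname)

# ... use the library here ...

# Explicitly drop our reference: FreeLibrary on Windows, dlclose on POSIX.
if sys.platform == 'win32':
    _ctypes.FreeLibrary(lib._handle)
else:
    _ctypes.dlclose(lib._handle)
print('released handle to', libname)
```

The caveat is that FreeLibrary/dlclose only decrement a reference count; the library leaves memory only when every loader that pulled it in has released it.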
Thanks,
Scott
Re: ctypes - unloading implicitly loaded dlls
Nick Craig-Wood wrote:
> You could try loading C explicitly with ctypes.LoadLibrary() before
> loading A, then you'll have a handle to unload it before you load B.

I did think of that, but no luck. Guess the cdll doesn't look for a dll loaded already by python. I guess that does make sense.

> I think I'd probably split the code into two or three processes though.
> Perhaps use http://pypi.python.org/pypi/processing to communicate
> between them. That should get you out of DLL Hell! (Don't load any of
> the DLLs before you start the worker processes off.)

That was quite a helpful suggestion, thank you. I had been using the subprocess module, actually, but really didn't like that approach. processing is much nicer; Pipes, in particular, is quite handy.

~Scott
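For anyone finding this thread later, the Pipe-based split looks roughly like this (a sketch using the stdlib multiprocessing names; the 2.5-era processing package exposed essentially the same API under `from processing import ...`). The worker here just echoes data back, where the real one would load its own copies of the vendor dlls so they never share an address space with the parent:

```python
from multiprocessing import Process, Pipe

def worker(conn):
    # The real worker would load its vendor DLLs here, isolated from
    # any libraries the parent process has loaded.
    msg = conn.recv()
    conn.send(msg.upper())
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=worker, args=(child_conn,))
    p.start()
    parent_conn.send('hello')
    print(parent_conn.recv())  # prints: HELLO
    p.join()
```

Per Nick's parenthetical, the important part is starting the worker processes before any of the conflicting DLLs are loaded, so each child begins with a clean loader state.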
Re: static variables in Python?
[EMAIL PROTECTED] wrote:
> kj:
>> OK, I guess that in Python the only way to do what I want to do is
>> with objects...
>
> There are other ways, like assigning the value out of the function,
> because Python functions too are objects: ... But I suggest you to use
> a class in this situation, it's often the way that will keep your code
> more bug-free, and more readable by near-casual readers too. Python
> philosophy asks you to write readable code instead of clever code when
> possible, this is a difference from Perl, I presume.
> Bye,
> bearophile

Here's a solution using decorators; I like it, but I'm biased:

    def staticAttrs(**kwds):
        """ Adds attributes to a function, akin to c-style "static" variables """
        def _decorator(fcn):
            for k in kwds:
                setattr(fcn, k, kwds[k])
            return fcn
        return _decorator

    @staticAttrs(n=0)
    def rememberCalls():
        """
        >>> rememberCalls()
        0
        >>> rememberCalls()
        1
        >>> rememberCalls()
        2
        """
        print rememberCalls.n
        rememberCalls.n += 1

~Scott
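For comparison, the class-based route bearophile suggests can be sketched like this: the instance attribute carries the state and `__call__` makes the object invocable like a function (the class and names below are made up for illustration; written for Python 3, where print is a function):

```python
class CallCounter:
    """Callable object; the instance attribute replaces the 'static' n."""

    def __init__(self):
        self.n = 0

    def __call__(self):
        # Same behavior as rememberCalls(): print the count, then bump it.
        print(self.n)
        self.n += 1

remember_calls = CallCounter()
remember_calls()  # prints: 0
remember_calls()  # prints: 1
remember_calls()  # prints: 2
```

The state lives on an ordinary instance rather than hanging off a function object, which tends to be easier for casual readers to follow.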
pyprocessing/multiprocessing for x64?
I recently learned (from a response on this newsgroup to an earlier query) of the processing module for working with subprocesses in a manner similar to threading. For what I needed to do, it worked great --- until I tried to run my code on an x64 box, for which that module isn't available*. So I'm just wondering: when processing is renamed to multiprocessing and included in the standard lib for 2.6, will x64 be supported?

~Scott

*yes, yes, I know: download the source and compile it myself.
Re: pyprocessing/multiprocessing for x64?
Interesting, I see Christian's responses to Benjamin, but not Benjamin's posts themselves. Anyways, the question remains: will multiprocessing be supported for the x64 platform when it's released in 2.6?

pigmartian wrote:
> I recently learned (from a response on this newsgroup to an earlier
> query) of the processing module for working with subprocesses in a
> similar manner to threading. For what I needed to do, it worked great
> --- until I tried to run my code on an x64 box, for which that module
> isn't available*. So, I'm just wondering if when processing is renamed
> to multiprocessing and included in the standard lib for 2.6, will x64
> be supported?
>
> ~Scott
>
> *yes, yes, I know. download the source and compile it myself.