Apparently nobody has proposed this yet:

>>> filter(letters.__contains__, astr)
'Bad'
>>> filter(set(letters).__contains__, astr)
'Bad'
Everyone is seeking early PEP 3000 compliance ;-)
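(astr and letters aren't shown above; judging from the benchmark data
below, the inputs were presumably something like:

letters = "adB"
astr = "Bob Carol Ted Alice"

which is why both calls return 'Bad' - filter keeps exactly the
characters of astr that appear in letters.)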
filter wins on conciseness - it's short enough to use in-line - but for a fair speed comparison I've wrapped each approach in a function below. str.translate is far ahead on speed for all but the shortest strings:
import string

def func_translate1(s, letters, table=string.maketrans("", "")):
    return s.translate(table, table.translate(table, letters))
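The nested translate call is doing double duty: the default table is
the 256-character identity table, and translating that table against
itself while deleting letters yields exactly the set of characters to
strip from s. A quick illustration of the Python 2 semantics:

>>> import string
>>> identity = string.maketrans("", "")   # all 256 byte values, in order
>>> to_delete = identity.translate(identity, "adB")
>>> len(to_delete)                        # everything except 'a', 'd', 'B'
253
>>> "Bob Carol Ted Alice".translate(identity, to_delete)
'Bad'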
def func_filter1(s, letters):
    in_set = letters.__contains__
    return filter(in_set, s)

def func_filter2(s, letters):
    in_set = set(letters).__contains__
    return filter(in_set, s)
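A quick sanity check, using the same inputs as the benchmark below,
that all three variants agree:

>>> s = "Bob Carol Ted Alice"
>>> func_translate1(s, "adB") == func_filter1(s, "adB") == func_filter2(s, "adB") == "Bad"
True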
>>> for m in (1, 10, 100, 1000, 10000):
...     s = "Bob Carol Ted Alice" * m
...     letters = "adB"
...     print "List length: %s" % len(s)
...     print shell.timefunc(func_translate1, s, letters)
...     print shell.timefunc(func_filter1, s, letters)
...     print shell.timefunc(func_filter2, s, letters)
...
List length: 19
func_translate1(...) 64179 iterations, 7.79usec per call
func_filter1(...) 63706 iterations, 7.85usec per call
func_filter2(...) 45336 iterations, 11.03usec per call
List length: 190
func_translate1(...) 54950 iterations, 9.10usec per call
func_filter1(...) 12224 iterations, 40.90usec per call
func_filter2(...) 10737 iterations, 46.57usec per call
List length: 1900
func_translate1(...) 22760 iterations, 21.97usec per call
func_filter1(...) 1293 iterations, 386.87usec per call
func_filter2(...) 1184 iterations, 422.52usec per call
List length: 19000
func_translate1(...) 3713 iterations, 134.67usec per call
func_filter1(...) 137 iterations, 3.67msec per call
func_filter2(...) 124 iterations, 4.05msec per call
List length: 190000
func_translate1(...) 426 iterations, 1.18msec per call
func_filter1(...) 14 iterations, 38.29msec per call
func_filter2(...) 13 iterations, 40.59msec per call
>>>
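(shell.timefunc is a local timing helper; for anyone wanting to
reproduce the comparison with nothing but the standard library, a
rough equivalent sketch using timeit - your numbers will differ:

import timeit

for m in (1, 10, 100, 1000, 10000):
    setup = ("from __main__ import func_translate1, func_filter1, func_filter2\n"
             "s = 'Bob Carol Ted Alice' * %d; letters = 'adB'" % m)
    print "List length: %s" % (19 * m)
    for name in ("func_translate1", "func_filter1", "func_filter2"):
        # best of 3 runs of 1000 calls each, reported per call
        best = min(timeit.Timer("%s(s, letters)" % name, setup).repeat(3, 1000))
        print "%s: %.2f usec per call" % (name, best * 1e6 / 1000)
)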
Michael