On 02Apr2019 13:14, DL Neil <pythonl...@danceswithmice.info> wrote:
>On 2/04/19 11:57 AM, Cameron Simpson wrote:
>>On 29Mar2019 09:32, DL Neil <pythonl...@danceswithmice.info> wrote:
>>>Do you 'keep' these, or perhaps next time you need something you've 'done before', do you remember when/where a technique was last used and burrow into 'history'?
>>>(else, code it from scratch, all over again)

>>I didn't say it before, but I _loathe_ the "files with code snippets" approach. Hard to search, hard to reuse, and it promotes cut/paste coding, which means that bugfixes _don't_ make it into all the codebases, just the one you're hacking right now.

>>I put them into modules for import. I've got a tree of Python modules named "cs.this", "cs.that" for various things. Then just import stuff like any other library module.

>Agreed - but Python doesn't (natively) like imports from a different/parallel/peer-level dir-tree. (see also Apache httpd)

Well, you have several options (each sketched below):

- modify $PYTHONPATH

- make a virtualenv

- make a symlink; Python imports from the current directory - not my favourite feature, but very handy in the dev environment
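
Something along these lines; all the paths and the module name here are invented for illustration:

    # option 1: extend the module search path
    export PYTHONPATH=$HOME/lib/python:$PYTHONPATH

    # option 2: make a virtualenv and install into it
    python3 -m venv ~/venvs/myproj
    ~/venvs/myproj/bin/pip install some_module

    # option 3: symlink a library tree into the directory you run
    # from; Python will then import it from the current directory
    ln -s ~/dev/pylib/cs ./cs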

>>For personal projects (if they aren't just part of that tree) I just need to symlink the tree into the local Python library as "cs".

>I was doing this.

This works really well for me. In particular, in my development tree I symlink to my _dev_ "cs" tree, so that if I modify a cs.* module (fix a bug, add a feature etc) the change is right there in the repo where I can tidy it up, commit it and possibly publish it.

>Much of my work is a simple delivery, i.e. copying code from my dev m/c to the client's PC or server - so I don't 'package it up'* because of the (perceived) effort required cf the one-to-one (or maybe a couple) machine relationships.

I'm doing something similar right now, but where I've used a cs.* module in their code it will go through PyPI as part of the prepare-for-deploy process so that they can reproduce the build themselves.

In my case the deploy.sh script makes a directory tree with a copy of the released code, from a repo tag name ("hg archive" for me; git's equivalent is "git archive"). That tree contains a prep.sh script which runs on the target machine to build the virtualenv and likewise the javascript side which is used for the web front end.
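
A minimal sketch of that kind of deploy.sh, assuming git and made-up paths (the real one does rather more):

    #!/bin/sh
    set -ue
    tag=$1
    dest=release-$tag
    mkdir "$dest"
    # copy the tagged code, sans repo metadata, into the tree
    git archive "$tag" | tar xf - -C "$dest"
    # hg equivalent: hg archive -r "$tag" "$dest"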

So deploy for me is:

- get the code ready (committed, tagged) at some suitable point

- run "./deploy.sh release-tag" which makes a deployable tree

- rsync onto the target (into a shiny new tree - I'll twiddle a symlink when it goes "live"; sketched below)

- run "./prep.sh" in the target, which fetches components etc

The advantage of the prep.sh step is that (a) the result matches the target host (where that matters, for example the local virtualenv will run off the target host's Python and so forth), and _also_ (b) it serves as doco for me on how to build the app: if prep.sh doesn't work, my client doesn't have a working recipe for building their app.
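
Again a sketch only, assuming a requirements.txt and an npm based front end (both invented details):

    #!/bin/sh
    set -ue
    # build the virtualenv from the target host's own Python
    python3 -m venv venv
    ./venv/bin/pip install -r requirements.txt
    # fetch the javascript side for the web front end
    ( cd frontend && npm install )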

I still need to add some prechecks to the deploy, like "is the venv requirements file committed to the repo" etc.
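
Such a precheck could be as small as this (assuming git and a top level requirements.txt):

    git ls-files --error-unmatch requirements.txt >/dev/null 2>&1 \
      || { echo "requirements.txt is not committed" >&2; exit 1; }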

>However, during 'delivery' to another machine, I have to remember to rsync/copy including the symlink, as well as to transfer both dir-trees.

By making a local virtualenv and getting my modules through PyPI (pip install) this issue is bypassed: there's the client code library (rsynced) and the third party modules fetched via PyPI. (Not just my cs.* modules if any, but also all the other modules required.)

>Recently I stopped to organise the code into (more) modules (as also suggested here) and moved to adding the 'utilities' directories into PYTHONPATH. Now I have to remember to modify that on the/each target m/c!

Script it. Include the script in the rsync.
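
For instance, ship a small environment script beside the code and source it on the target (names illustrative):

    # env.sh - source from the deployed tree:  . ./env.sh
    export PYTHONPATH=$PWD/utils${PYTHONPATH:+:$PYTHONPATH}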

>>If I get something well enough defined and sufficiently cleaned up for use beyond my own code (or just good enough that others might want it), up it goes to PyPI so that it can just be "pip install"ed.

>>So I've got this huge suite of modules with stuff in them, and a subset of those modules are published to PyPI for third party reuse.

>Am dubious that any of the 'little bits' that I have collected are sufficiently worthy, but that's 'proper' F/LOSSy thinking!

If you're reusing your little bits then they need organisation into modules so that you can include them with the main code without treading on others' namespaces.

Publishing to PyPI is a very, very optional step; the critical thing is breaking stuff up into modules. Rsyncing them is then easy, and you have a common codebase of what _were formerly_ snippets for reuse.

Also, putting them in modules and using them that way forces you to make your snippets reusable instead of cut/paste/adaptable. Win win.
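
Concretely, that can be as little as one namespace directory for your own bits (names invented):

    mylib/__init__.py   # empty file; makes "mylib" importable
    mylib/textutils.py  # string/report helpers
    mylib/fileutils.py  # file and path helpers
    # application code then does: from mylib.textutils import ...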

>*will be interested to hear if you think I should stop 'being lazy' and invest some time-and-effort into learning 'packaging' options and do things 'properly'/professionally...

There's real pain there. The first time I did this (2015?) I basically spent a whole week right after new year figuring out how to make a package and push it up; the doco was fragmented and confusing, and in some cases out of date, and there is/was more than one way to do things.

These days things are somewhat cleaner: start here:

 https://packaging.python.org/
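
Roughly, the flow with a setuptools based setup.py looks like this (a sketch only; see that page for the real details):

    python3 -m pip install --user twine wheel
    python3 setup.py sdist bdist_wheel    # build into dist/
    python3 -m twine upload dist/*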

>One of the points which intrigues me is that my colleagues don't keep snippets/a library, preferring to remember (hah!) when/where they used particular techniques in the past, and copying/duplicating, to fit the new system's requirements. Am wondering if this is true beyond our little band?

Among random people I suspect it is quite common. In larger teams you accrue a utilities library which contains this stuff.

>Yet, here on the list, interest was shown in 'being organised' (even if few have actually weighed-in)...

Get your reused code organised - it is a huge win. At least make a little library of modules in its own namespace (so you can just import it without conflict). What automatic/publishing/distribution processes you follow after that is your choice. But having that library? Invaluable.

Cheers,
Cameron Simpson <c...@cskk.id.au>
--
https://mail.python.org/mailman/listinfo/python-list
