Thanks for the insight, Russ. I appreciate it.

On May 31, 3:09 am, "Russell Keith-Magee" <[EMAIL PROTECTED]>
wrote:
> On Thu, May 29, 2008 at 10:14 PM, jfinke <[EMAIL PROTECTED]> wrote:
>
> > The problem then becomes: is there any easy way to sync the notebook
> > information back with HQ when the notebooks come home or VPN in or
> > whatever?  Do I need to write a program to export it out and then
> > import it back in?  Or are there already existing features that handle
> > something like disconnected clients?
>
> To the best of my knowledge, there isn't an 'out of the box' solution
> for this. However, it can be done - I've been involved in the
> development of an implementation of a system that is almost identical
> to the one you describe.
>
> In the implementation I was working on, the architecture went
> something like this:
>
>  * There was a single set of models; the main server ran Postgres, and
> kept the canonical copy. The clients (tablet PCs and handhelds) ran
> SQLite versions in their own local server.
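>
> Concretely, each side just points Django at a different backend in its
> settings.py. A rough sketch using the settings format of the day
> (database names and paths here are illustrative):
>
>     # settings.py on the server - the canonical copy lives in Postgres.
>     DATABASE_ENGINE = 'postgresql_psycopg2'
>     DATABASE_NAME = 'canonical'
>
>     # settings.py on each client - a local SQLite mirror.
>     DATABASE_ENGINE = 'sqlite3'
>     DATABASE_NAME = '/home/notebook/local_data.db'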
>
>  * On first connection, the client would poll an 'export' URL on the
> server that would pull down, in XML format, a dump of the relevant
> table content from the server, which would be loaded into the client's
> local SQLite database. The server kept track of which rows of data
> (specifically, which primary keys - more on this later) had been sent
> to the client.
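>
> The export view can lean on Django's built-in XML serializer. A rough
> sketch - the Notebook and SentRecord names are illustrative, not from
> the real implementation:
>
>     # models.py (server) - bookkeeping for what each client has seen.
>     from django.db import models
>
>     class SentRecord(models.Model):
>         client_id = models.CharField(max_length=64)
>         pk_sent = models.IntegerField()
>
>     # views.py (server) - dump the relevant table content as XML.
>     from django.core import serializers
>     from django.http import HttpResponse
>     from myapp.models import Notebook, SentRecord
>
>     def export(request, client_id):
>         notebooks = Notebook.objects.all()
>         for nb in notebooks:
>             # Remember which primary keys this client now holds.
>             SentRecord.objects.get_or_create(client_id=client_id,
>                                              pk_sent=nb.pk)
>         return HttpResponse(serializers.serialize('xml', notebooks),
>                             mimetype='application/xml')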
>
>  * The client would then operate as a normal Django application, using
> the local SQLite database. Users worked against a web application
> running on a local server; no connection between the main server and
> the client machine was needed.
>
>  * When the client reconnected to the network, it would POST a
> serialized version of the local data back to the server via an
> 'import' URL.
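>
> On the client side, the upload is essentially serialize-and-POST. A
> sketch using the Python 2 standard library (the URL, client id and
> Notebook model are again illustrative):
>
>     # sync.py (client) - push the local data back to HQ.
>     import urllib
>     import urllib2
>
>     from django.core import serializers
>     from myapp.models import Notebook
>
>     def push_to_server():
>         payload = serializers.serialize('xml', Notebook.objects.all())
>         body = urllib.urlencode({'client_id': 'tablet-42',
>                                  'data': payload})
>         urllib2.urlopen('http://hq.example.com/sync/import/', body)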
>
>  * The real magic came on the server side. Since there can be multiple
> clients connecting to the server, each client could be producing
> database entries with conflicting primary keys. As a result, the
> server can't just accept the POSTed data as-is - it needs to maintain
> an internal accounting table that correlates the primary key on the
> client (as provided in the data upload) with the primary key that
> exists locally on the server. If a new record is found in the POSTed
> data, a new record is created on the server, and the accounting table
> stores a record that correlates the server's canonical primary key
> with the primary key received from the client.
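>
> The accounting table itself can be a perfectly ordinary Django model.
> A sketch (again, the names are mine, not the real implementation's):
>
>     # models.py (server) - correlate client pks with canonical pks.
>     from django.db import models
>
>     class KeyMapping(models.Model):
>         client_id = models.CharField(max_length=64)
>         model_name = models.CharField(max_length=128)  # e.g. 'myapp.notebook'
>         client_pk = models.IntegerField()  # pk as the client knows it
>         server_pk = models.IntegerField()  # canonical pk on the server
>
>         class Meta:
>             unique_together = (('client_id', 'model_name', 'client_pk'),)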
>
> Essentially, the server maintains a complete picture of all data from
> all clients, plus a picture of the primary keys that are available on
> each individual client. Whenever the server sends data to the
> client, the primary keys are normalized to match the expectations of
> the client, and whenever data is received from the client, the primary
> keys are normalized back into the canonical server representation.
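>
> Normalization is then just a lookup through that table in each
> direction. Roughly, building on the illustrative KeyMapping model
> above:
>
>     def to_server_pk(client_id, model_name, client_pk, create_record):
>         """Map a client pk to the canonical server pk, creating the
>         server record (via the supplied callable) on first sight."""
>         try:
>             mapping = KeyMapping.objects.get(client_id=client_id,
>                                              model_name=model_name,
>                                              client_pk=client_pk)
>             return mapping.server_pk
>         except KeyMapping.DoesNotExist:
>             record = create_record()  # insert the new canonical row
>             KeyMapping.objects.create(client_id=client_id,
>                                       model_name=model_name,
>                                       client_pk=client_pk,
>                                       server_pk=record.pk)
>             return record.pk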
>
> Based on my experiences, I suspect that it is something that could be
> turned into a generic tool/utility which would make a useful
> contribution to the community. Unfortunately, the implementation that
> I worked on probably won't ever see the outside world as a generic
> tool (due to a combination of factors, including the customer
> involved, and the fact that I no longer work at the company that
> developed the product).
>
> Yours,
> Russ Magee %-)