Well, it depends on the MFUs, I guess. I would probably bypass web2py for 
this and simply use a JavaScript-based reader in the browser that receives 
updates over WebSockets instead of Ajax polling. Of course, you'd have to 
have WebSocket support on all the MFUs.

Is there any reason why you'd use a database for this? If all you are 
looking for is real-time status, then there should be no need for a 
database; a simple RAM cache should work for what you are trying to do. 
Can you program the MFUs? I think it would be easier for 30 MFUs to open 
connections to one server than for one server to open 30 connections (one 
per MFU). 
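
To make the RAM cache idea concrete, here is a minimal sketch of the kind 
of collector I mean: a standalone asyncio TCP server (stdlib only) that the 
MFUs connect out to and push into, keeping just the most recent report per 
unit in memory. The LATEST dict and the newline-delimited "unit_id status" 
wire format are assumptions for illustration, not anything your MFUs speak 
today:

import asyncio

LATEST = {}  # in-memory "RAM cache": unit_id -> most recent status report

async def handle_mfu(reader, writer):
    # One long-lived connection per MFU; every line it pushes simply
    # overwrites that unit's slot in the cache.
    try:
        while True:
            line = await reader.readline()
            if not line:
                break  # MFU disconnected
            unit_id, _, status = line.decode(errors="replace").strip().partition(" ")
            LATEST[unit_id] = status
    finally:
        writer.close()

async def main():
    server = await asyncio.start_server(handle_mfu, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())

Whatever serves the browsers (web2py, or a small WebSocket gateway) then 
only ever reads LATEST; no database writes and no per-request socket 
handling.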

On Monday, February 4, 2013 12:46:22 PM UTC-7, Bernard wrote:
>
> Hi web2py users,
>    I've been using web2py for a few months now; thank you to the 
> developers for the great work.
>
>    I'm working on an interactive web-based monitoring and control 
> application that communicates with ~30 mobile field units at a time to 
> get periodic 'semi-realtime' status reports (2-5 second poll period) and 
> to let the user change settings of the field units on demand.  The 
> communications channel uses TCP sockets: the web2py workstation end is 
> the TCP client, and each field unit runs as a TCP server on 
> low-performance embedded hardware.  The front end currently does 
> periodic Ajax polling every 2 seconds and updates the GUI.  I would also 
> like to support multiple web users connected to the application on the 
> front end.
>
>    I've searched the mailing lists of web2py and other frameworks but 
> could not find a use case similar to mine.  There are many ways of 
> implementing this, and it's not easy to figure out which is best and 
> what pitfalls may lie ahead.
> Here are some of the approaches I have considered:
> 1- Use a background asynchronous "Data Acquisition" task that is always 
> running and fills a "RealTime" table in the database (by polling all 
> field units every 2 seconds). For each web request, the controller would 
> then pick up the latest values from the database and serve them to web 
> clients without having to worry about pulling the data. The background 
> task keeps the sockets open to improve performance.
> 2- The controller communicates with the ~30 field units directly, 
> bypassing any database overhead. The controller needs a persistent 
> reference to the 30 TCP sockets to make the comms faster. Is there a way 
> to parallelize the TCP request/response in the request thread to 
> communicate with ~30 units quickly? To handle multiple web users, I can 
> cache the controller function so that it doesn't run on every web client 
> request.
> 3- Have the web2py controller communicate with a separate data 
> acquisition process via message queues. The web2py parts would never 
> deal with the low-level comms, and the external data acquisition 
> component would abstract all that away. However, this comes at the 
> expense of having to create an external component, define the interface 
> to it, and add a messaging framework between web2py and the data 
> acquisition process.
> 4- The controller kicks off a worker thread that collects the field 
> unit status, with the controller function cached to avoid spawning a 
> task for every web request.
> 5- Other ideas that might be better suited to this application?
>
> If anybody has gone through something similar, can you please share your 
> experience?
> If you see any issues or potential weaknesses in any of these 
> approaches, your feedback would be greatly appreciated.
>
> Regards,
> Bernard
>
>
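
That said, if you do keep polling from the server side, the 
parallelization you ask about in your option 2 is straightforward with a 
thread pool. A rough sketch: the one-line STATUS request/response protocol 
and the UNITS address list are made up for illustration, and it reopens 
the sockets each cycle for simplicity (your persistent-socket idea would 
additionally need a lock per connection):

import socket
from concurrent.futures import ThreadPoolExecutor

UNITS = [("10.0.0.%d" % i, 5000) for i in range(1, 31)]  # hypothetical addresses

def poll_unit(addr):
    # One request/response exchange; the 2 s timeout keeps one dead unit
    # from stalling the whole poll cycle.
    try:
        with socket.create_connection(addr, timeout=2) as s:
            s.sendall(b"STATUS\n")
            return addr, s.recv(1024).decode().strip()
    except OSError as exc:
        return addr, "error: %s" % exc

def poll_all():
    # Fan the ~30 exchanges out in parallel, so the cycle time is bounded
    # by the slowest unit (or the timeout), not the sum of all of them.
    with ThreadPoolExecutor(max_workers=len(UNITS)) as pool:
        return dict(pool.map(poll_unit, UNITS))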

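As for keeping 30 poll cycles from running when 30 browsers are connected 
(your options 2 and 4), web2py's built-in cache.ram is enough: memoize the 
poll result for one poll period so all concurrent web users share a single 
cycle. A sketch of a controller action, assuming the poll_all above is 
importable from a module (the action name status is arbitrary):

def status():
    # poll_all only actually runs when the 2-second cache entry has
    # expired; every other request within the period gets the cached dict.
    readings = cache.ram('latest_status', poll_all, time_expire=2)
    return response.json(readings)

Even so, if you can program the MFUs, I'd still have them push as in the 
first sketch rather than poll them at all.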