That's two connections at the process level, so it is not the limit for a whole machine. For example, on the hardware side a server machine may carry eight network cards on a PC motherboard (going by what is on the market), each card can handle 12 connections per server process, each client process keeps its 2 connections, and an IPv4 subnet offers 254 addresses, so there is still plenty of leverage to push throughput all the way up to the switch.
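As a rough back-of-the-envelope illustration of that point (the figures come from the paragraph above, and the way they are combined here is an illustrative assumption, not a measured or specified limit), a per-process cap of 2 still multiplies out at the machine level:

    # Illustrative arithmetic only; figures are taken from the text above
    # and the multiplication is an assumption about how they combine.
    per_client_process = 2      # RFC 2616 suggestion, per service
    hosts_on_subnet = 254       # usable addresses on an IPv4 /24
    server_nics = 8             # example PC-motherboard configuration

    # One viewer process per host, all talking to one server IP:
    per_server_ip = per_client_process * hosts_on_subnet    # 508
    # With a server IP on each of the eight cards:
    per_machine = per_server_ip * server_nics               # 4064
    print(per_server_ip, per_machine)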
On Monday, July 30, 2012, Oz Linden (Scott Lawrence) wrote:

> On 2012-07-29 04:07, Henri Beauchamp wrote:
> > With just a few tweakings to the texture fetcher (and in particular
> > a setting that permits up to 32 concurrent HTTP fetches),
>
> Caution... once we begin supporting persistent connections and pipelined
> requests on the server side (we're working on it), opening that many
> connections will very likely be considered abuse of the service. Allowing
> a single server to have that many potentially long-lived connections to the
> same server would starve the file descriptor pool very quickly at the
> server end, adversely affecting other users.
>
> HTTP itself recommends that clients not maintain more than 2 connections
> to the same service [1]. I don't know exactly what limit we will decide is
> reasonable (I expect that 2 will be ok, but don't know whether or not some
> larger number will be also).
>
> Please bear this in mind as you think about your designs. I will keep you
> all informed as things develop.
>
> [1] RFC 2616, section 8.1.4 <http://tools.ietf.org/html/rfc2616#section-8.1.4>
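For what it's worth, here is a minimal sketch of how a client could honor a small per-service cap while still reusing persistent (keep-alive) connections, in the spirit of RFC 2616 section 8.1.4. This is not the viewer's actual fetcher code; the host name, paths and cap of 2 are placeholders, and the pool is only meant to show the idea of borrowing and returning a fixed number of open sockets:

    import http.client
    import queue
    import threading

    HOST = "textures.example.com"   # placeholder service name
    MAX_CONNECTIONS = 2             # RFC 2616's suggested per-service limit

    # A small pool of persistent connections; workers borrow and return
    # them, so no matter how many fetches are queued, at most 2 sockets
    # are ever open to this host.
    pool = queue.Queue()
    for _ in range(MAX_CONNECTIONS):
        pool.put(http.client.HTTPConnection(HOST))

    def fetch(path):
        conn = pool.get()           # blocks until a connection is free
        try:
            conn.request("GET", path)
            return conn.getresponse().read()
        finally:
            pool.put(conn)          # return it for reuse instead of closing

    # Many queued requests, but never more than 2 concurrent connections.
    paths = ["/texture/%d" % i for i in range(32)]
    threads = [threading.Thread(target=fetch, args=(p,)) for p in paths]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

The same shape works whether the cap ends up being 2 or some larger number the server side settles on; only MAX_CONNECTIONS changes.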