Hi Eliezer,

It's mostly like a live feed.
I am writing these sites (plus a client tracking parameter) to a flat file via Squid, from where another process reads them and does further processing (e.g. analyzing the top sites used by any particular client). That is why I was working on capturing just the URLs entered by clients.

Ambadas

On Tue, Oct 13, 2015 at 2:01 PM, Eliezer Croitoru <elie...@ngtech.co.il> wrote:
> Hey Ambadas,
>
> I was wondering if you want it to be something like a "live feed" or just
> for logs analyzing?
>
> Eliezer
>
> On 09/10/2015 15:47, Ambadas H wrote:
>> Hi,
>>
>> I am using the below setup:
>> Squid proxy 3.5.4
>> CentOS 7.1
>>
>> I am trying to analyze the most used websites by the users via Squid proxy.
>> I just require the first GET request for a particular browsed page,
>> not the subsequent GETs for that same page.
>>
>> E.g.:
>> 1) user enters http://google.com in the client (Mozilla)
>> 2) client gets a page containing some other URLs
>> 3) client initiates multiple GETs for the same requested page without the
>> user's knowledge
>>
>> I tried a logic where I assumed that if the "Referer" header is present,
>> then it is not the first GET but a subsequent one for the same requested page.
>>
>> I know I can't rely on the "Referer" header always being present, as it is
>> not mandatory. But I want to know if my logic is correct, and also whether
>> there is any alternative solution?
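For illustration, below is a rough Python sketch of the kind of post-processing that other process could do on the flat file. It is only a sketch, not my actual setup: it assumes a custom logformat along the lines of

    logformat urlfeed %>a %ru %{Referer}>h
    access_log /var/log/squid/urlfeed.log urlfeed

and the format name, field order and log path are placeholders. It applies the "Referer" heuristic from the quoted mail: requests that carry a Referer are skipped, the rest are counted per client.

    #!/usr/bin/env python3
    # Sketch: count top sites per client from a flat file written by Squid.
    # Assumes three whitespace-separated fields per line:
    #   client-IP  request-URL  Referer ("-" when the header is absent).
    import sys
    from collections import Counter, defaultdict
    from urllib.parse import urlsplit

    LOG_PATH = "/var/log/squid/urlfeed.log"  # assumed path, adjust to taste

    def top_sites(path, limit=10):
        per_client = defaultdict(Counter)
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                if len(parts) < 3:
                    continue
                client, url, referer = parts[0], parts[1], parts[2]
                # Heuristic from the thread: a request with a Referer is
                # (usually) a follow-up fetch for an already-counted page.
                if referer != "-":
                    continue
                site = urlsplit(url).netloc or url
                per_client[client][site] += 1
        for client, counts in per_client.items():
            print(client)
            for site, hits in counts.most_common(limit):
                print("  %6d  %s" % (hits, site))

    if __name__ == "__main__":
        top_sites(sys.argv[1] if len(sys.argv) > 1 else LOG_PATH)

As noted in the quoted mail, this undercounts or overcounts whenever clients omit or strip the Referer header, so it is a heuristic rather than an exact answer.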