There's no functionality similar to v2's fromEncoded in v3 yet.  Since
the encoding process performs polyline simplification anyway, I'd
suggest you apply a similar technique (à la the Ramer-Douglas-Peucker
algorithm) to your current dataset.  It'd be better for your servers
and your users if you're not throwing 75k points at the Maps API.
While I get the idea of the technique, I won't pretend to fully
understand all of the math behind it (I didn't pay that much attention
in geometry class, unfortunately), but there are enough code samples
out there to get you where you need to go...
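For what it's worth, here's a rough sketch of how Ramer-Douglas-Peucker works, in plain JavaScript.  This is just an illustration, not production code: it assumes points are simple {x, y} objects (treating lat/lng as planar coordinates, which is fine for simplification purposes at polygon scale), and the recursion could blow the stack on a truly huge path, so you'd want an iterative version for 75k points.

```javascript
// Perpendicular distance from point p to the line through a and b.
function perpendicularDistance(p, a, b) {
  var dx = b.x - a.x, dy = b.y - a.y;
  var len = Math.sqrt(dx * dx + dy * dy);
  if (len === 0) {
    // a and b coincide; fall back to point-to-point distance.
    return Math.sqrt(Math.pow(p.x - a.x, 2) + Math.pow(p.y - a.y, 2));
  }
  return Math.abs(dy * p.x - dx * p.y + b.x * a.y - b.y * a.x) / len;
}

// Ramer-Douglas-Peucker: drop every point that lies within `epsilon`
// of the line between the segment's endpoints, recursing on the point
// farthest from that line when it exceeds the tolerance.
function simplify(points, epsilon) {
  if (points.length < 3) return points.slice();
  var maxDist = 0, index = 0;
  var last = points.length - 1;
  for (var i = 1; i < last; i++) {
    var d = perpendicularDistance(points[i], points[0], points[last]);
    if (d > maxDist) { maxDist = d; index = i; }
  }
  if (maxDist > epsilon) {
    // Keep the farthest point and simplify both halves around it.
    var left = simplify(points.slice(0, index + 1), epsilon);
    var right = simplify(points.slice(index), epsilon);
    return left.slice(0, -1).concat(right);
  }
  // Everything between the endpoints is within tolerance; drop it all.
  return [points[0], points[last]];
}
```

The `epsilon` tolerance is the knob you'd tune: larger values throw away more points.  With geographic data you'd pick it based on how much visual error (in degrees, roughly) you can tolerate at your typical zoom level.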


On Nov 17, 5:22 pm, hvr <[email protected]> wrote:
> Thanks for the link.  Do the encoding examples you've linked to work
> with V3 of the API?
>
> On Nov 17, 4:04 pm, arclyte <[email protected]> wrote:
>
> > For encoding, check out:
>
> >http://facstaff.unca.edu/mcmcclur/GoogleMaps/EncodePolyline/
>
> > There's code there for a few different languages.
>
> > I think you'd definitely want to simplify the polygons prior to even
> > using them in production.
>
> > On Nov 16, 5:19 pm, hvr <[email protected]> wrote:
>
> > > Hi There,
>
> > > I'm responsible for a project where we are mapping polygons on a
> > > google map.  my dilemma is that the data points i've been given are
> > > really really accurate. Some polygons are larger than 75,000 points.
> > > Using V3 of the API what are my options for smoothing these polygons?
>
> > > I notice that V2 of the API has an encoding function which will do
> > > this but no such feature exists for V3.  I've also investigated using
> > > 3rd party libraries but these all seem to be written for V2.

--

You received this message because you are subscribed to the Google Groups 
"Google Maps JavaScript API v3" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/google-maps-js-api-v3?hl=.

