Thanks Andrea for your input, it sounds hopeful.
Our goal is to make GeoServer work with MS SQL Server 2008, where data is
stored as nvarchar, which has led us to use a GeoServer nightly build. Is there
any experience out there of the difference between PostgreSQL and MSSQL from a
spatial point of view? I try a "GetFeature" with a bounding box, and the query
that runs on the database server is:
SELECT "GlobalUniqueIdentifier", ..and 20 other columns..
  ,CAST("PointCoordinate".STSrid as VARCHAR) + ':' + "PointCoordinate".STAsText() as "PointCoordinate"
FROM "DataTable"
WHERE "PointCoordinate".Filter(geometry::STGeomFromText('POLYGON ((61 13, 61 13.5, 61.5 13.5, 61.5 13, 61 13))', 4326)) = 1
The Filter method is supposed to be very fast, but after 1 h 40 min all memory
on the disk is eaten up by swapping (MSSQL). I studied the estimated execution
plan, and it turns out MSSQL does not want to use the spatial index I made. As
far as I can see, I cannot edit the query that comes from GeoServer, which is
MS's solution to the problem (forcing the index with a WITH hint). Should I
create the layer with an SQL statement from my store?
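If an SQL-based layer turns out to be the way to go, it could in principle embed the index hint that MSSQL needs. A minimal sketch, reusing the query above; the index name "SpatialIndex_PointCoordinate" is an assumption, adjust it to whatever the actual spatial index is called:

```sql
-- Sketch: same query as before, but forcing the spatial index via a table hint
-- ("SpatialIndex_PointCoordinate" is a hypothetical index name).
SELECT "GlobalUniqueIdentifier",
       CAST("PointCoordinate".STSrid as VARCHAR) + ':' + "PointCoordinate".STAsText() as "PointCoordinate"
FROM "DataTable" WITH (INDEX("SpatialIndex_PointCoordinate"))
WHERE "PointCoordinate".Filter(
        geometry::STGeomFromText('POLYGON ((61 13, 61 13.5, 61.5 13.5, 61.5 13, 61 13))', 4326)
      ) = 1
```

Whether this can be wired into GeoServer as an SQL-backed layer without losing the bounding-box parameterization is exactly my question.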
Any input or ideas are helpful at this point.
/maria
________________________________
From: [email protected] [[email protected]] on behalf of Andrea Aime
[[email protected]]
Sent: December 1, 2011 10:15
To: Maria Ripa
Cc: [email protected]
Subject: Re: [Geoserver-users] Large data sets
On Thu, Dec 1, 2011 at 8:46 AM, Maria Ripa <[email protected]> wrote:
Hi List,
We have large datasets of points in a database. We need to publish our data via
WFS in GeoServer. I would like to know what is considered to be large from a
GeoServer point of view. Is there a breakpoint where the number of rows starts
to be too many? I have 30 million points, which seems to be problematic
(300,000 points is no problem though).
I've seen installations serving up to 500 million polygons. GeoServer streams
out the results, so its memory footprint stays small. What is normally slow
with these data amounts is the extraction from the database, and for that one
has to tune the database itself: playing with the query planner tunables to
make it use the spatial indexes, clustering the tables, clustering the
database itself on enough machines, and so on.
When growing that large there is no single recipe, I think; what needs to be
done differs depending on the data, usage patterns and so on.
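To illustrate the table-clustering idea on the PostgreSQL/PostGIS side (table and index names here are hypothetical; MSSQL has its own equivalents):

```sql
-- Physically reorder the table along its spatial (GiST) index so that
-- bounding-box queries touch fewer disk pages. "points" and
-- "points_geom_idx" are made-up names for the example.
CREATE INDEX points_geom_idx ON points USING GIST (geom);
CLUSTER points USING points_geom_idx;
ANALYZE points;  -- refresh planner statistics after the reorder
```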
Cheers
Andrea
--
-------------------------------------------------------
Ing. Andrea Aime
GeoSolutions S.A.S.
Tech lead
Via Poggio alle Viti 1187
55054 Massarosa (LU)
Italy
phone: +39 0584 962313
fax: +39 0584 962313
http://www.geo-solutions.it
http://geo-solutions.blogspot.com/
http://www.youtube.com/user/GeoSolutionsIT
http://www.linkedin.com/in/andreaaime
http://twitter.com/geowolf
-------------------------------------------------------
_______________________________________________
Geoserver-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/geoserver-users