Michael,
You wrote: "I don't know how you want to use the featureCache. As I imagined
it
until now, it was just used to keep a part of your features in memory,
most of the data staying in the original file. But another solution
(yours ?) is to entirely copy the original data file in your own
inde
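To make the first idea concrete (keeping only part of the features in
memory), here is a rough sketch of a cache that holds a bounded number of
features and goes back to the file on a miss. FeatureSource, Feature, and
read() are stand-in names I made up for the example, not OpenJUMP or Agile
classes:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Stand-in types for the sketch, not real OpenJUMP interfaces.
    interface Feature {}

    interface FeatureSource {
        Feature read(int featureId); // re-read one feature from the original file
    }

    class FeatureCache {
        private final FeatureSource source;
        private final Map<Integer, Feature> cache;

        FeatureCache(FeatureSource source, final int maxEntries) {
            this.source = source;
            // access-ordered map that evicts the least recently used feature
            this.cache = new LinkedHashMap<Integer, Feature>(16, 0.75f, true) {
                protected boolean removeEldestEntry(Map.Entry<Integer, Feature> eldest) {
                    return size() > maxEntries;
                }
            };
        }

        synchronized Feature get(int featureId) {
            Feature f = cache.get(featureId);
            if (f == null) {
                // cache miss: most of the data stays in the original file
                f = source.read(featureId);
                cache.put(featureId, f);
            }
            return f;
        }
    }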
Sunburned Surveyor,
> I'm glad you think so. I like the term "FeatureOnDemand". Do you mind
> if I use it as the name of the light-weight feature class?
I guess I first read this term in Agile's code. I don't mind who uses it,
and I hope Alvaro Zabala, the original developer of Agile, doesn't mind
either.
Michael,
Your comments were very helpful. Please see my responses below.
You wrote: "I think that a light-weight feature class or FeatureOnDemand is
a good
solution, as well as a FeatureCache."
I'm glad you think so. I like the term "FeatureOnDemand". Do you mind if I
use it as the name of the
Hi Sunburned,
I think that a light-weight feature class or FeatureOnDemand is a good
solution, as well as a FeatureCache.
I already tested Agile's scalable shapefile driver, and I'm currently
implementing something similar for the GeoConcept format (a commercial GIS).
It can save a lot of memory (but …
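To show what such a light-weight feature could look like, here is a minimal
sketch: the object holds only the offset of its record and decodes the full
data the first time it is asked for it. ShapefileReader, FeatureData, and
readRecord() are placeholder names for the example, not the actual Agile
classes:

    // Placeholder types invented for the sketch.
    interface FeatureData {}

    interface ShapefileReader {
        FeatureData readRecord(long offset); // seek to a record and decode it
    }

    class FeatureOnDemand {
        private final ShapefileReader reader; // shared reader over the .shp file
        private final long recordOffset;      // where this feature's record starts
        private FeatureData data;             // stays null until first access

        FeatureOnDemand(ShapefileReader reader, long recordOffset) {
            this.reader = reader;
            this.recordOffset = recordOffset;
        }

        FeatureData getData() {
            if (data == null) {
                data = reader.readRecord(recordOffset); // load lazily, on first use
            }
            return data;
        }
    }

A FeatureCache in front of a collection of such features would keep recently
used records from being re-read from disk every time they are touched.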
Erwan,
Can I ask why you would create a custom binary format instead of using the
standard Java serialization format? I'm interested in the performance
differences between a text-based storage format and a binary storage format.
I would be very interested in the results from your laboratory. Perhaps …
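For what it's worth, one reason a custom layout usually wins on size is that
ObjectOutputStream writes class descriptors and a header for every object,
while a hand-rolled format writes only the raw values. A toy illustration
(FormatSizeDemo and its Point class are made up for the example; this is not
a real benchmark of either format):

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.util.ArrayList;
    import java.util.List;

    public class FormatSizeDemo {

        static class Point implements Serializable {
            final double x, y;
            Point(double x, double y) { this.x = x; this.y = y; }
        }

        public static void main(String[] args) throws IOException {
            List<Point> points = new ArrayList<Point>();
            for (int i = 0; i < 5000; i++) {
                points.add(new Point(i, -i));
            }

            // standard Java serialization of the whole list
            ByteArrayOutputStream serialized = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(serialized);
            oos.writeObject(points);
            oos.close();

            // hand-rolled binary layout: a count followed by raw doubles
            ByteArrayOutputStream custom = new ByteArrayOutputStream();
            DataOutputStream dos = new DataOutputStream(custom);
            dos.writeInt(points.size());
            for (Point p : points) {
                dos.writeDouble(p.x);
                dos.writeDouble(p.y);
            }
            dos.close();

            System.out.println("Java serialization: " + serialized.size() + " bytes");
            System.out.println("custom binary:      " + custom.size() + " bytes");
        }
    }

Standard serialization also ties the file layout to the Java classes that
wrote it, which matters if the format is meant to outlive the code.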
Hi Sunburned,
Currently, in my laboratory we are working on this problem. We are studying
two solutions:
- use GML as a native format (as TAB is for MapInfo),
- create a specific binary format for OJ.
When we have some results, I will send a mail to the OJ team.
Cheers.
R1.
On 3/29/07, Sunburned Surveyor wrote: …
I've been working at home over the past couple of weeks on a solution to the
problem of handling very large datasets in OpenJUMP. (For those of you
that don't know, OpenJUMP reads all features in from a data source into
memory. This isn't a problem until you start working with some very large
datasets.) …