Hello,
On 03.01.11 14:14, Andre Lopes wrote:
Hi,
Thanks for the replies. I was tempted to accept Radosław Smogura's
proposal. There will be about 100 websites to capture data from on a daily
basis. Each website adds, on average, 2 articles per day.
Thomas talked about the NoSQL possibility. What do
Hello,
On 03.01.11 12:46, Radosław Smogura wrote:
I can propose something like this:

website  (id int, url varchar);
attr_def (id int, name varchar);
attr_val (id int, def_id int references attr_def(id),
          website_id int references website(id), value varchar);
If all of your attributes in website
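To illustrate the proposed EAV (entity-attribute-value) layout, here is a minimal sketch. The thread targets PostgreSQL, but sqlite3 is used here as a stand-in so the snippet runs anywhere; the sample URLs and attribute names (`language`, `country`) are illustrative assumptions, not from the original message.

```python
# EAV sketch: one row per (website, attribute) pair; sqlite3 stands in
# for PostgreSQL, and the sample data is made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE website  (id INTEGER PRIMARY KEY, url TEXT);
    CREATE TABLE attr_def (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE attr_val (id INTEGER PRIMARY KEY,
                           def_id INTEGER REFERENCES attr_def(id),
                           website_id INTEGER REFERENCES website(id),
                           value TEXT);
""")
conn.execute("INSERT INTO website  VALUES (1, 'http://example.com')")
conn.execute("INSERT INTO attr_def VALUES (1, 'language'), (2, 'country')")
conn.execute("INSERT INTO attr_val VALUES (1, 1, 1, 'en'), (2, 2, 1, 'US')")

# Fetch all attributes of one site by joining values to their definitions.
rows = conn.execute("""
    SELECT d.name, v.value
    FROM attr_val v JOIN attr_def d ON d.id = v.def_id
    WHERE v.website_id = 1
    ORDER BY d.name
""").fetchall()
print(rows)  # [('country', 'US'), ('language', 'en')]
```

The upside of this layout is that new attributes need no schema change; the downside is that reassembling one "record" always requires a join (or pivot) like the one above.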
Hello,
On 03.01.11 12:11, Andre Lopes wrote:
Hi,
I need advice about a database structure. I need to capture data from the
web about one specific subject on a few specific websites and insert that data
into a database. I have asked this question here before, but I think I have not
explained very
Hello,
On 03.01.11 00:06, Adrian Klaver wrote:
On Sunday 02 January 2011 2:22:14 pm Thomas Schmidt wrote:
well, I'm new to Postgres and this is my first post on this list :-)
Anyway, I have to batch-import bulk CSV data into a staging database (as
part of an ETL-like process).
Hello,
well, I'm new to Postgres and this is my first post on this list :-)
Anyway, I have to batch-import bulk CSV data into a staging database (as
part of an ETL-like process). The data ought to be read via STDIN;
however, to keep it simple and stupid, saving it to a file and
importing afterwards
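In PostgreSQL the idiomatic tool for this staging load is COPY (e.g. COPY staging FROM STDIN WITH (FORMAT csv), or psql's \copy for a client-side file). As a runnable sketch of the same step, the snippet below parses CSV and bulk-inserts it into a staging table; sqlite3 stands in for PostgreSQL, an in-memory string stands in for STDIN, and the table and column names are assumptions.

```python
# Sketch of a CSV staging load. In PostgreSQL this whole step is usually
# a single COPY ... FROM STDIN WITH (FORMAT csv); here sqlite3 and a
# StringIO "stdin" are stand-ins so the example is self-contained.
import csv
import io
import sqlite3

raw = io.StringIO(
    "id,title,url\n"
    "1,First,http://a.example\n"
    "2,Second,http://b.example\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, title TEXT, url TEXT)")

reader = csv.DictReader(raw)  # yields one dict per CSV row, keyed by header
conn.executemany(
    "INSERT INTO staging (id, title, url) VALUES (:id, :title, :url)",
    reader,
)
count = conn.execute("SELECT count(*) FROM staging").fetchone()[0]
print(count)  # 2
```

Loading into a bare staging table first, then transforming with SQL, keeps the import step dumb and restartable, which matches the "simple and stupid" goal above.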