Hello,
I've got a table like this:
db._common_fields.append(auth.signature)
db.define_table('locations',
    Field('title', 'string'),
    Field('capacity', 'integer'),
    Field('celcat_code', 'string'),
    Field('is_external', 'boolean', default
Hi
I created a table definition in db.py using csvstudio.py, but I cannot
load the values through db.table.import_from_csv_file because the headers in
the CSV have spaces and uppercase letters.
Is there a way to tell import_from_csv_file how to map the columns in the
CSV to the fields in the table definition?
By the
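A workaround (not a built-in option of import_from_csv_file, as far as I know) is to normalize the header row before handing the file to the importer, so that a header like "Celcat Code" becomes celcat_code. A minimal sketch in plain Python; the column names in the usage comment are just examples:

```python
import csv
import io

def normalize_header(csv_text):
    """Lowercase the header row and replace spaces with underscores,
    so 'Celcat Code' becomes 'celcat_code'; data rows are untouched."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    rows[0] = [h.strip().lower().replace(' ', '_') for h in rows[0]]
    out = io.StringIO()
    csv.writer(out).writerows(rows)
    # Return a fresh stream the importer can read from the start.
    return io.StringIO(out.getvalue())

# Hypothetical usage with the DAL (file/table names are examples):
# stream = normalize_header(open('locations.csv').read())
# db.locations.import_from_csv_file(stream)
```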
Hi,
Does import_from_csv_file only perform SQL INSERT statements, or can it
also update records? In that case, would the presence of a unique column in
the CSV (without the id) be enough for it?
Thanks
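As I understand the DAL, import_from_csv_file accepts a unique argument (defaulting to 'uuid'): when the CSV contains that column and a record with the same value already exists, the row is updated rather than inserted, so an id column is not required. The update-vs-insert semantics, sketched with plain dicts rather than real web2py code:

```python
def upsert(records, incoming, unique='uuid'):
    """Illustration of the `unique` semantics: update records whose
    `unique` value already exists, insert the rest."""
    index = {r[unique]: r for r in records if unique in r}
    for row in incoming:
        key = row.get(unique)
        if key in index:
            index[key].update(row)      # existing row: update in place
        else:
            records.append(dict(row))   # new row: insert
    return records
```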
I recently learned (by searching through the Google group) the usefulness
of the id_map={} parameter of the db.import_from_csv_file function.
I was wondering why this is not covered in the web2py documentation.
Is it on the 'todo' list, or is there another reason it is not
included?
Be
Assistance, anyone? Thanks.
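For anyone finding this thread: as I understand it, passing id_map={} lets db.import_from_csv_file record the old-id to new-id mapping per table, so that reference fields can be translated when a whole database is re-imported and rows receive fresh ids. A rough illustration of why such a map is needed (illustrative code, not the DAL internals):

```python
def remap_references(rows, ref_field, id_map):
    """Rewrite rows[i][ref_field] from old ids to new ids via id_map;
    values with no mapping are left alone."""
    for row in rows:
        old = row.get(ref_field)
        if old in id_map:
            row[ref_field] = id_map[old]
    return rows

# Usage against the real API (signature per the web2py DAL):
# id_map = {}
# db.import_from_csv_file(open('backup.csv'), id_map=id_map)
```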
--
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
Hi again :)
I would like to import a CSV file but set the unique field/column to
*full_name_code* (instead of *uuid*).
So here's what I did:
def import_csv():
    form = SQLFORM.factory(
        Field('csv_file', 'upload', uploadfield=False)
    )
    if form.process().accepted:
        ff = r
Using db.import_from_csv_file(...) to restore a complete database from a
file created using db.export_to_csv_file(...) used to work.
With 2.8.2 the process loops forever (at least in the development
environment; I haven't tried it in production).
The problem seems to be in gluon\dal.p
Is there something like validate_and_insert() for import_from_csv_file()?
I would like to update my database from CSV files, but with the form
validators fired, to ensure that the inserted data is of the correct type.
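As far as I know there is no validating variant of import_from_csv_file; a common workaround is to read the file with csv.DictReader and insert row by row, which in web2py would be db.table.validate_and_insert(**row). A hedged sketch, with generic per-field validator callables standing in for the table's requires:

```python
import csv
import io

def validated_import(csv_text, validators, insert_fn):
    """validators: {field: callable(value) -> error message or None}.
    Rows that pass every check go to insert_fn; failures are collected
    as (line_number, {field: error}) and nothing is inserted for them."""
    errors = []
    # Header is line 1, so the first data row is line 2.
    for lineno, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        bad = {f: e for f, check in validators.items()
               if (e := check(row.get(f, ''))) is not None}
        if bad:
            errors.append((lineno, bad))
        else:
            insert_fn(row)   # in web2py: db.table.validate_and_insert(**row)
    return errors
```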
--
The project I will be working on requires all users to import from
spreadsheets, so that's not really possible.
On Wednesday, June 20, 2012 9:49:57 AM UTC-5, Richard wrote:
>
> Why don't you use the appadmin export and import?
>
> Richard
>
> On Wed, Jun 20, 2012 at 9:55 AM, joe wrote:
>
>> I want t
Why don't you use the appadmin export and import?
Richard
On Wed, Jun 20, 2012 at 9:55 AM, joe wrote:
> I want to import from a csv file, but I want to use the imported data in
> only one of my database tables, not all of them, which is how I have seen
> the examples online. Here is my code, whic
I want to import from a csv file, but I want to use the imported data in
only one of my database tables, not all of them, which is how I have seen
the examples online. Here is my code, which does not work:
Model
---
db.define_table(
    'upload',
    Field('name'),
    Field('e
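The snippet above is cut off, but note that the table-level call db.upload.import_from_csv_file(stream) imports into a single table, matching CSV headers to that table's field names, whereas db.import_from_csv_file(...) restores every table from one file. If the file carries extra columns, one approach is to cut it down to the fields of the one table first; a plain-Python sketch (the field list in the usage comment is a placeholder):

```python
import csv
import io

def columns_for_table(csv_text, fields):
    """Keep only the columns whose headers appear in `fields`,
    returning a fresh stream suitable for a table-level import."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction='ignore')
    writer.writeheader()
    for row in reader:
        writer.writerow({f: row.get(f, '') for f in fields})
    return io.StringIO(out.getvalue())

# Hypothetical usage:
# db.upload.import_from_csv_file(columns_for_table(text, ['name']))
```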
I exported a table from sqlite with:
open(filename, 'w').write(str(db(db.category.id).select()))
Output file looks as expected.
And then I tried importing into postgres with:
db.category.import_from_csv_file(filename)
Each row was inserted but all values are NULL.
Any ideas?
Version 1.99.4
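A hedged guess at the cause: str(rows) produces a human-readable dump whose header and null conventions are not guaranteed to match what import_from_csv_file expects (by default the importer treats the literal '<NULL>' marker as a missing value), so the columns do not line up and every value lands as NULL. The supported round trip is export_to_csv_file(...) on the Rows or db object, paired with import_from_csv_file(...). A sketch of the conventions the importer assumes:

```python
import csv
import io

def rows_to_csv(fields, rows, null='<NULL>'):
    """Write one header row of field names, then data rows, using the
    `null` marker for missing values (the DAL default is '<NULL>')."""
    out = io.StringIO()
    w = csv.writer(out)
    w.writerow(fields)
    for row in rows:
        w.writerow([null if row.get(f) is None else row[f] for f in fields])
    return out.getvalue()

# Prefer the built-in exporter for a real round trip, e.g.:
# db(db.category.id > 0).select().export_to_csv_file(open(filename, 'wb'))
```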
In my application which is hosted on GAE I have backup/restore
functions which use export_to_csv_file/import_from_csv_file.
While this is not a practical backup/restore for GAE (restore would
take far too long and timeout, so for this purpose I use GAE bulk
loader tool), I have used it in the past
Hello,
I have a Topics tree like this:
db.define_table("Topics",
    Field("parent_id", "reference Topics", label=T("Parent Topic"),
        #~ widget=my_hierarchical_options_widget,  # probably overridden by requires
        requires=IS_EMPTY_OR(IS_IN_DB(db,
            'Topics.id', 'Topics.n