Dear Richard,

I am moving my database from SQLite to MySQL while keeping the ids the same. What is the best way to do this?
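
Is it something like the following? This is only my sketch of the CSV export/import route; the file name 'alldata.csv' and the MySQL connection details are placeholders, and I am not sure whether the ids are kept as-is or whether id_map is needed:

    # with the models defined against the current SQLite database
    db.export_to_csv_file(open('alldata.csv', 'wb'))

    # then point the model at MySQL instead, e.g.
    #   db = DAL('mysql://user:password@localhost/mydbname')
    # let web2py create the (empty) tables, and load the data back:
    db.import_from_csv_file(open('alldata.csv', 'rb'))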

regards,



On Friday, October 12, 2012 9:44:59 PM UTC+3, Richard wrote:
>
> Depending on your backend; for example, with Postgres:
>
> pg_dump --attribute-inserts \
>   --file=/whereYouWantToPutTheFile/"dbName_with_inserts_commands_and_blobs_dump_"`date +"%Y-%m-%d_%H-%M-%S"`".gz" \
>   --compress=9 --role=roleName --username=postgres dbName
>
> Run it from the command line.
>
> Or you may also use pgAdmin.
>
> Once you have the file you can just open it (depending on how big your 
> database is) or restore it from the command line (if it is too big to open in an editor).
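>
> For a plain-SQL gzipped dump like the one above, the restore could be as simple as the 
> following sketch ("theDumpFile.gz" stands for the file created by the pg_dump command, 
> and the target database is assumed to exist already):
>
> gunzip -c /whereYouWantToPutTheFile/theDumpFile.gz | psql --username=postgres dbName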
>
>
> Richard
>
> On Wed, Oct 10, 2012 at 3:00 AM, hasan alnator <haln...@gardeniatelco.com> wrote:
>
>> Dear Simon,
>>
>> I use the ids in my tables, and after the move the ids changed. How can I export 
>> and import the data with the backup/restore mechanism you mentioned above?
>>
>> Can you give me an example?
>>
>> regards,
>>
>> On Wed, Oct 10, 2012 at 1:58 AM, hasan alnator <haln...@gardeniatelco.com> wrote:
>>
>>> Dear Massimo, 
>>>
>>> I have web2py Version 1.99.4 (2011-12-14 14:46:14) stable.
>>>
>>> I opened dal.py and this is what I have in there:
>>>
>>> def import_from_csv_file(self, ifile, id_map=None, null='<NULL>',
>>>                          unique='uuid', *args, **kwargs):
>>>     if id_map is None: id_map={}
>>>     for line in ifile:
>>>         line = line.strip()
>>>         if not line:
>>>             continue
>>>         elif line == 'END':
>>>             return
>>>         elif not line.startswith('TABLE ') or not line[6:] in self.tables:
>>>             raise SyntaxError, 'invalid file format'
>>>         else:
>>>             tablename = line[6:]
>>>             self[tablename].import_from_csv_file(ifile, id_map, null,
>>>                                                  unique, *args, **kwargs)
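>>>
>>> If I understand it correctly, this method is called on the whole DAL instance with an 
>>> open CSV file, something like the sketch below (the file name 'alldata.csv' and the 
>>> explicit id_map are only my guesses at the usage):
>>>
>>>     id_map = {}  # as I understand it, the DAL fills this in so references can be remapped
>>>     db.import_from_csv_file(open('alldata.csv', 'rb'), id_map=id_map)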
>>>
>>>
>>>
>>> Regards, 
>>>
