DBF is the native dBase database format, used by FoxPro etc.
But anyhow, I think what you've suggested is what I'll have to do: take
records from all my MySQL table(s), put unique records (based on your
unique-identifier exception code) into a temp MySQL table, and then simply
transfer the records in
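That temp-table plan can be sketched roughly as follows. This is a minimal sketch in Python, with sqlite3 standing in for MySQL (MySQL's equivalent of `INSERT OR IGNORE` is `INSERT IGNORE`); the `calls` table and its `phone`/`callerid` columns are hypothetical stand-ins for the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table with duplicate rows (stand-in for the real MySQL table).
cur.execute("CREATE TABLE calls (phone TEXT, callerid TEXT)")
cur.executemany("INSERT INTO calls VALUES (?, ?)",
                [("5551234", "Smith"), ("5551234", "Smith"), ("5559876", "Jones")])

# Temp table with a UNIQUE index over the identifying columns.
cur.execute("CREATE TEMP TABLE uniq_calls "
            "(phone TEXT, callerid TEXT, UNIQUE (phone, callerid))")

# INSERT OR IGNORE silently drops any row that violates the unique index,
# so only one copy of each duplicate survives.
cur.execute("INSERT OR IGNORE INTO uniq_calls SELECT phone, callerid FROM calls")

rows = cur.execute("SELECT phone, callerid FROM uniq_calls ORDER BY phone").fetchall()
print(rows)  # deduplicated rows, ready to be written out to the DBF file
```

The DBF-writing step itself is left out; only the MySQL-side deduplication is shown.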
I have no idea what DBF is, but if the data is already in MySQL with
duplicates, you'll need to take everything I just said, and
"translate" it to DBF.
If it cannot be translated to DBF because DBF has no unique indices,
you could, perhaps, create a TEMP table in MySQL, with the unique
index, inse
I'm exporting the data from MySQL table(s) into a dBase DBF table. The
unique index you're talking about should be on the DBF end, if I'm not
mistaken - but I'm not sure how to do that, or whether that will make MySQL
get that error and fail the second insert.
Unless I'm not getting this right.
On Mon, March 26, 2007 2:28 pm, Rahul Sitaram Johari wrote:
>> Another option would be to just create a UNIQUE INDEX on the fields
>> you think "should" be unique, and then your second insert is gonna
>> fail, and you can just ignore that.
>
> Could you possibly elaborate on this?
> Things I'm tryi
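To elaborate on that suggestion: with a UNIQUE index in place, the second insert of the same key raises a duplicate-key error (MySQL error 1062), and the importing script just catches and ignores it. A minimal sketch, again with sqlite3 standing in for MySQL, where the duplicate surfaces as an `IntegrityError`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# UNIQUE constraint on the field that "should" be unique (hypothetical name).
cur.execute("CREATE TABLE records (phone TEXT, UNIQUE (phone))")

inserted = 0
for phone in ["5551234", "5559876", "5551234"]:  # the third is a duplicate
    try:
        cur.execute("INSERT INTO records (phone) VALUES (?)", (phone,))
        inserted += 1
    except sqlite3.IntegrityError:
        pass  # duplicate-key failure: just ignore it and move on
print(inserted)  # 2
```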
ert the id's in an
>> auxiliary
>> table and then delete the records.
>>
>> Satyam
>>
>> - Original Message -
>> From: "Rahul Sitaram Johari" <[EMAIL PROTECTED]>
>> To: "Mark" ; "PHP"
>> Sent: Frida
d in
> the subquery, thus, you will have to first insert the id's in an
> auxiliary
> table and then delete the records.
>
> Satyam
>
> - Original Message -
> From: "Rahul Sitaram Johari" <[EMAIL PROTECTED]>
> To: "Mark" ; "PHP"
Rahul Sitaram Johari wrote:
>
> Ave,
>
> It's definitely not live data, so that is not a problem at all. But I'm
> not sure I understand your method very well.
>
> I do understand getting data from both the existing DBF and the multiple
> mySQL tables into a temporary mySQL table. But if I do g
Ave,
> A better solution would be to add a column in the MySQL table, maybe call it
> "processed" with a default value of 0, and update this value to 1 with each
> row inserted. Then you are only querying records where processed=0.
> Of course this will not work if you cannot modify the MySQL tab
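The "processed" flag idea above, sketched with sqlite3 standing in for MySQL (table, column, and default names are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Extra column with a default of 0, flipped to 1 once a row is exported.
cur.execute("CREATE TABLE calls (phone TEXT, processed INTEGER DEFAULT 0)")
cur.executemany("INSERT INTO calls (phone) VALUES (?)",
                [("5551234",), ("5559876",)])

# Export pass: only rows not yet transferred to DBF.
todo = cur.execute("SELECT rowid, phone FROM calls WHERE processed = 0").fetchall()
for rowid, phone in todo:
    # ... write the row to the DBF file here ...
    cur.execute("UPDATE calls SET processed = 1 WHERE rowid = ?", (rowid,))

# A second pass finds nothing left to export.
leftover = cur.execute("SELECT COUNT(*) FROM calls WHERE processed = 0").fetchone()[0]
print(leftover)  # 0
```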
>
Ave,
It's definitely not live data, so that is not a problem at all. But I'm not
sure I understand your method very well.
I do understand getting data from both the existing DBF and the multiple
mySQL tables into a temporary mySQL table. But if I do go ahead and do that,
I guess I could write a
>
> Ave,
>
> "Three: Insert everything and remove duplicates later."
>
> Out of the suggested options, this one sounds the most sane and
> attainable on my end. I don't have a complete grip on how to accomplish
> this, but it certainly sounds feasible. Let me look at ways to achieve this.
>
> Than
ahul Sitaram Johari" <[EMAIL PROTECTED]>
To: "Mark" ; "PHP"
Sent: Friday, March 23, 2007 5:24 PM
Subject: Re: [PHP] Add New Records Only!
Ave,
"Three: Insert everything and remove duplicates later."
Out of the suggested options, this one sounds the most sane and
attainable on my end. I don't have a complete grip on how to accomplish
this, but it certainly sounds feasible. Let me look at ways to achieve this.
Thanks!
On 3/23/07
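One concrete way to "insert everything and remove duplicates later" is to rebuild the table from a `SELECT DISTINCT` (or `GROUP BY`) and swap it in. A sketch under the same sqlite3-for-MySQL assumption, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE calls (phone TEXT, callerid TEXT)")
cur.executemany("INSERT INTO calls VALUES (?, ?)",
                [("5551234", "Smith"), ("5551234", "Smith"), ("5559876", "Jones")])

# Rebuild: copy one row per distinct (phone, callerid) pair, then swap tables.
cur.execute("CREATE TABLE calls_clean AS SELECT DISTINCT phone, callerid FROM calls")
cur.execute("DROP TABLE calls")
cur.execute("ALTER TABLE calls_clean RENAME TO calls")

count = cur.execute("SELECT COUNT(*) FROM calls").fetchone()[0]
print(count)  # 2
```

On live data this swap needs care (anything inserted between the copy and the drop is lost), which is why the thread checked that the data isn't live first.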
Rahul Sitaram Johari wrote:
As far as I can see, there are probably only three ways to do this:
One: Make sure your dBase system has unique primary key capability, and
use it to avoid duplicates.
Two: Query for the row; if it isn't there, insert it. (You'll have to deal
with concurrency with l
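Option two, check-then-insert, looks roughly like this; note the race window between the SELECT and the INSERT if two writers run at once, which is what the parenthetical warns about (names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE records (phone TEXT)")

def insert_if_absent(phone):
    # Check-then-insert: only safe if no other writer runs in between.
    row = cur.execute("SELECT 1 FROM records WHERE phone = ?", (phone,)).fetchone()
    if row is None:
        cur.execute("INSERT INTO records (phone) VALUES (?)", (phone,))
        return True
    return False

results = [insert_if_absent(p) for p in ["5551234", "5551234", "5559876"]]
print(results)  # [True, False, True]
```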
you could do a serialized MD5 sum of the more unique columns from
your db and use that as a unique identifier for the row. Then you would
need to compare this 'key' before doing an insert.
Also, keeping a log of the date and time of the last operation would help,
as then you only need to sele
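The MD5 idea: serialize the most identifying columns, hash them into one digest, and treat that digest as the row's key, either comparing it before the insert or putting a unique index on it. A sketch with hypothetical column values:

```python
import hashlib
import sqlite3

def row_key(*columns):
    # Serialize the identifying columns and hash them into one comparable key.
    # A separator byte avoids ("ab", "c") and ("a", "bc") colliding.
    payload = "\x1f".join(str(c) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE records (row_hash TEXT PRIMARY KEY, phone TEXT, callerid TEXT)")

for phone, callerid in [("5551234", "Smith"), ("5551234", "Smith"), ("5559876", "Jones")]:
    # The duplicate digest is silently skipped by INSERT OR IGNORE.
    cur.execute("INSERT OR IGNORE INTO records VALUES (?, ?, ?)",
                (row_key(phone, callerid), phone, callerid))

count = cur.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 2
```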