Jason Pruim
li...@pruimphotography.com
On Oct 27, 2011, at 2:44 PM, Tommy Pham wrote:
> On Wed, Oct 26, 2011 at 5:47 PM, Jason Pruim
> wrote:
>
> Jason Pruim
> li...@pruimphotography.com
>
> The server that's running it is a home computer with a VPS installed... It's
> not my dev environm
>
> Good luck, that's a LOT of reading. I'd estimate that's about 3k+ pages
> of reading. :)
>
>> > Regards,
>> > Tommy
>
Nice to see someone else is finally getting the point that I've been making.
On Wed, Oct 26, 2011 at 5:47 PM, Jason Pruim wrote:
>
> Jason Pruim
> li...@pruimphotography.com
>
> The server that's running it is a home computer with a VPS installed...
> It's not my dev environment :)
>
>
Home computer used for a production environment? Wow.. I'm speechless.
> The informat
Jim Giner wrote:
Your boss wants to give access to phone numbers to the public in general?
Then what?
Glad mine's unlisted.
Is it?
Does it start 518248 ?
I often forget to get a phone number when a parcel has to go by carrier, and
paypal does not include that info, but only rarely does one no
Jason Pruim
li...@pruimphotography.com
On Oct 26, 2011, at 9:09 PM, Jim Giner wrote:
> Your boss wants to give access to phone numbers to the public in general?
> Then what?
>
> Glad mine's unlisted.
There's no identifying information on the phone numbers... Simply just the
phone number...
On Wed, Oct 26, 2011 at 4:14 AM, Lester Caine wrote:
> Tommy Pham wrote:
>
>>
>>Many of my customers have coming up on 20 years of data available.
>> There has
>>been a debate on transferring historic data to a separate database, but
>>having it available is not causing a problem, exc
Tommy Pham wrote:
I wonder ... The real question is what's the purpose of the DB? Is it for OLAP
or OLTP? ;)
As for dealing with DB having millions of rows, you're crossing over into DBA
area.
Many of my customers have coming up on 20 years of data available. There has
been a debate on trans
Tommy Pham wrote:
> It turns out the issue was actually in the pagination... I'm reworking the
> whole thing and streamlining it... But in the pagination that I found on
> the internet it used a "SELECT COUNT(*) WHERE state='{$state}'"; and the
> COUNT was killing the time... Once that was remo
On Tue, Oct 25, 2011 at 7:06 PM, Jason Pruim wrote:
>
> It turns out the issue was actually in the pagination... I'm reworking the
> whole thing and streamlining it... But in the pagination that I found on
> the internet it used a "SELECT COUNT(*) WHERE state='{$state}'"; and the
> COUNT was kill
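One way to keep a total for the pagination math without paying for that
COUNT(*) on every request (not necessarily what the OP ended up doing) is to
run the count once and cache it. A minimal sketch, assuming placeholder
connection details and a hypothetical phonenumbers table rather than the OP's
real schema:

<?php
// Hypothetical sketch: cache the per-state row count in the session so the
// slow SELECT COUNT(*) runs once instead of on every page view.
session_start();

$state = 'MI'; // normally taken from validated user input

$mysqli = new mysqli('localhost', 'user', 'pass', 'phonedb');

if (!isset($_SESSION['row_count'][$state])) {
    $stmt = $mysqli->prepare('SELECT COUNT(*) FROM phonenumbers WHERE state = ?');
    $stmt->bind_param('s', $state);
    $stmt->execute();
    $stmt->bind_result($count);
    $stmt->fetch();
    $stmt->close();
    $_SESSION['row_count'][$state] = $count;
}

$total   = $_SESSION['row_count'][$state];
$perPage = 50;
$pages   = (int) ceil($total / $perPage);
?>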
"David Robley" wrote in message
news:49.50.34068.1b567...@pb1.pair.com...
>
> Consider running EXPLAIN on all your queries to see if there is something
> Mysql thinks could be done to improve performance.
>
Why do so many responders seem to think the problem here is in the
preparation of the qu
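For reference, a minimal illustration of David Robley's EXPLAIN suggestion; the
table and column names are placeholders, not the OP's actual schema.

<?php
// Run EXPLAIN against the slow query and inspect the "key" and "rows"
// columns. key = NULL with a huge row estimate usually means a full table
// scan; an index on the filtered column is the first thing to try.
$mysqli = new mysqli('localhost', 'user', 'pass', 'phonedb');

$result = $mysqli->query(
    "EXPLAIN SELECT * FROM phonenumbers WHERE state = 'MI' LIMIT 0, 50"
);

while ($row = $result->fetch_assoc()) {
    print_r($row);
}
?>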
Jason Pruim
li...@pruimphotography.com
On Oct 25, 2011, at 6:35 PM, Jim Giner wrote:
> Again why even do a detail query? Nobody is going to examine pages and
> pages and etc.
> Do a summary qry if u just need a count - no pagination there
> jg
The bosses wanted them to be able to page t
Again why even do a detail query? Nobody is going to examine pages and pages
and etc.
Do a summary qry if u just need a count - no pagination there
jg
I disagree. It's not about "tuning the queries", it is more about the appl.
design that currently thinks it SHOULD do such huge queries.
My approach would be to prompt the user for filtering criteria that
automatically would reduce the result set size. Although at this time I
believe the OP m
On Mon, Oct 24, 2011 at 7:50 PM, Jason Pruim wrote:
> I have about 89 million records in mysql... the initial load of the page
> takes 2 to 3 minutes, I am using pagination, so I have LIMIT's on the SQL
> query's... But they just aren't going fast enough...
>
> What I would like to do, is pull t
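For context, a rough sketch of the LIMIT-based pagination the OP describes,
with placeholder table and column names. A large OFFSET still makes MySQL walk
past all of the skipped rows, which is part of why deep pages on an
89-million-row table stay slow.

<?php
// Hypothetical paginated listing; 'phonenumbers' and its columns are
// assumptions for illustration only.
$perPage = 50;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$mysqli = new mysqli('localhost', 'user', 'pass', 'phonedb');
$stmt   = $mysqli->prepare(
    'SELECT id, phone FROM phonenumbers WHERE state = ? LIMIT ?, ?'
);
$state = 'MI';
$stmt->bind_param('sii', $state, $offset, $perPage);
$stmt->execute();
$result = $stmt->get_result(); // requires mysqlnd

while ($row = $result->fetch_assoc()) {
    echo htmlspecialchars($row['phone']), "<br>\n";
}
?>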
Yes - but - we're talking about a user-app that the OP is trying to provide
89M records to. Sure - "some" users might have need of looking at even as
much as a million records IF they were researching something that needed it.
But - for the 'general' user of an app - I cannot see a need to be p
On Oct 24, 2011, at 10:44 PM, Jim Giner wrote:
> Why would any user need to have access to 89M records?
They don't need access to it to edit it... Just to be able to view it... Also,
it will be expanding in the future to include a lot more data.
Jason Pruim
li...@pruimphotography.com
On 2011-10-24, at 10:44 PM, "Jim Giner" wrote:
> Why would any user need to have access to 89M records?
History or audit trail data? I can think of lots. I know of an app wit
Why would any user need to have access to 89M records?
On 2011-10-24, at 8:50 PM, Jason Pruim wrote:
> Now that I've managed to list 3 separate programming languages and somewhat
> tie it back into php here's the question...
>
> I have about 89 million records in mysql... the initial load of the page
> takes 2 to 3 minutes, I am using pagination
2-3 minutes is far too long. I think your SELECT query and MySQL schema have
something that can be improved.
Sent from a handheld device.
The PHP scripting language has no bearing on the output unless you have
characters in the php file itself.
We had a similar issue at work. They found a way to do it using iconv,
but had to change because Red Hat's iconv isn't updated. They do
something with saving the output to a utf8 encode
On Fri, 2009-03-27 at 17:40 +0800, Ai Leen wrote:
> Hi Everyone,
>
> I need to export data from a database with UTF-8 encoding to a csv file. I am
> outputting html tables with the Content-Type set to ms-excel.
>
> The Chinese text came out as symbols. I tried
> using mb_convert_encoding on the text
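One workaround often used for the symptom described above (not necessarily the
one the list settled on): send the data as real CSV rather than an HTML table,
and prepend a UTF-8 byte-order mark so Excel picks the right encoding. The
header row and sample values below are placeholders.

<?php
// Hedged sketch: CSV download with a UTF-8 BOM so Excel does not fall back
// to a legacy codepage and turn Chinese text into symbols.
header('Content-Type: text/csv; charset=UTF-8');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');
fwrite($out, "\xEF\xBB\xBF");             // UTF-8 byte-order mark

fputcsv($out, array('Name', 'Phone'));    // placeholder header row
fputcsv($out, array('张三', '555-0100')); // placeholder data row
fclose($out);
?>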
>
>Come on, Shirah. GOYA and RTFM! ;-P
>
>http://php.net/ifx_fieldtypes
>
>Drop the $i parameter below:
>
>
> [snip!]
> > $query = ifx_query ($sql, $connect_id);
> [snip!]
> > $head[] = ifx_fieldtypes($query, $i);
> [snip!]
Son of a Bisquick! I'll chalk that up to looking at the Ma
On Wed, Jun 18, 2008 at 5:53 PM, Dan Shirah <[EMAIL PROTECTED]> wrote:
> Hello all,
>
[snip!]
>
> Well, when I execute the page I get the popup to save/open the output as an
> Excel file. When I open it, instead of getting data, it returns an error
> message of: PHP Warning: Wrong parameter count
Ave,
I was fiddling my way around with foreach and found a solution to my
problem. Here's my code:
// define the array
$tChoice = array(
"lodispo_osma" => "ATL",
"lodispo_osmh" => "HOU",
"lodispo_osmj" => "JAX",
"lodispo_osmt" => "TPA",
"lodispo" => "VB
To export it exactly as displayed (like when you print to a virtual printer to
generate a PDF) might be tricky, but you can definitely create Excel and I
believe Word files without even having Excel or Word installed. If you DO
have Excel or Word installed on your server, then you can always u
Content-Type is referred to as a mime header (8-}), I often
see something similar in email.
HTH,
Warren Vail
(415) 667-0240
SF211-07-434
-Original Message-
From: Philip Thompson [mailto:[EMAIL PROTECTED]
Sent: Wednesday, October 13, 2004 6:18 AM
To: Vail, Warren
Cc: [EMAIL PROTECTED]
Subjec
In an attempt to make amends, have you tried PEAR's
Spreadsheet_Excel_Writer?
http://pear.php.net/package/Spreadsheet_Excel_Writer
It certainly works, we used it last week (although I have no *personal*
experience of it)
Cheers
Chris
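From memory, basic usage of that PEAR package looks roughly like the sketch
below (sheet name, cell values, and filename are placeholders; check the
package documentation for the exact API).

<?php
// Hedged sketch of PEAR Spreadsheet_Excel_Writer producing a native .xls
// download instead of CSV or an HTML table.
require_once 'Spreadsheet/Excel/Writer.php';

$workbook = new Spreadsheet_Excel_Writer();
$workbook->send('report.xls');            // sets the download headers

$worksheet =& $workbook->addWorksheet('Sheet1');
$worksheet->write(0, 0, 'Name');
$worksheet->write(0, 1, 'Phone');
$worksheet->write(1, 0, 'Example');
$worksheet->write(1, 1, '555-0100');

$workbook->close();
?>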
sorry
[skulks off into dark corner to cry]
Philip Thompson wrote:
Chris, as you may have (or may have not) noticed, I did that. Look at my
original email. I wasn't sure if there was more to it than that.
~Philip
On Oct 13, 2004, at 8:34 AM, Chris Dowell wrote:
Header("Content-type: application/m
Chris, as you may have (or may have not) noticed, I did that. Look at
my original email. I wasn't sure if there was more to it than that.
~Philip
On Oct 13, 2004, at 8:34 AM, Chris Dowell wrote:
Header("Content-type: application/ms-excel");
or whatever the content type should be (on IE you can pr
Header("Content-type: application/ms-excel");
or whatever the content type should be (on IE you can probably get away
with application/octet-stream, as it bases its decisions on file
extensions for a lot of things).
HTH
Chris
Philip Thompson wrote:
How exactly would I change the mime headers?
~Ph
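Put together, the approach Chris describes looks roughly like this; the
filename and table contents are placeholders, and the headers must be sent
before any other output.

<?php
// Hedged sketch: serve an HTML table with Excel MIME headers so the browser
// hands it to Excel instead of rendering it.
header('Content-Type: application/ms-excel');
header('Content-Disposition: attachment; filename="project.xls"');

echo "<table>\n";
echo "<tr><td>Cell 1</td><td>Cell 2</td></tr>\n";
echo "</table>\n";
?>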
How exactly would I change the mime headers?
~Philip
On Oct 12, 2004, at 5:07 PM, Vail, Warren wrote:
Have you tried changing your file name to project.htm but continue
issuing
the mime headers for excel?
Warren Vail
-Original Message-
From: Philip Thompson [mailto:[EMAIL PROTECTED]
Sent:
Have you tried changing your file name to project.htm but continue issuing
the mime headers for excel?
Warren Vail
-Original Message-
From: Philip Thompson [mailto:[EMAIL PROTECTED]
Sent: Tuesday, October 12, 2004 2:21 PM
To: [EMAIL PROTECTED]
Subject: [PHP] Exporting HTML to Excel
Hi
Philip, have you checked the php freaks website? I do recall seeing a script
in their tutorials library that claimed to create Excel files.
http://www.phpfreaks.com
Just had a search and came up with:
[The PHP snippet was mangled by the archive; it echoed a small HTML table
containing the cells "Cell 1" and "Cell 2".]
HTH
Graham
> -Original Message-
> From: Philip Thompson [mailt
"Jay Blanchard" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
>
> What, exactly, do you mean? A CSV file that resembles a spreadsheet? A
> doc with CSV?
The term "CSV" is very common; it means "Comma Separated Values". It is the
format that originally was (and is) used by the Basic
"Harlequin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> I'm using MySQL Jay
>
> I can present the results in a table easy enough and if I replace the TD
> TAGs with commas etc. I get a screen output that resembles a CSV file but
> need to go that one step further and don't know ho
[snip]
I'm using MySQL Jay
I can present the results in a table easy enough and if I replace the TD
TAGs with commas etc. I get a screen output that resembles a CSV file
but
need to go that one step further and don't know how...
[/snip]
Use CSV (without the table thing) and place Excel header def
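A short sketch of what Jay is suggesting, with placeholder data: build plain
comma-separated lines instead of an HTML table, and send Excel/CSV headers so
the browser offers the result as a file to save.

<?php
// Hedged sketch of "CSV plus Excel header definitions"; the rows and the
// filename are placeholders.
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment; filename="results.csv"');

$rows = array(
    array('Name', 'Phone'),        // placeholder header row
    array('Example', '555-0100'),  // placeholder data row
);

$out = fopen('php://output', 'w');
foreach ($rows as $row) {
    // fputcsv() handles quoting; on very old PHP, implode(',', $row) works
    // for simple values.
    fputcsv($out, $row);
}
fclose($out);
?>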
I'm using MySQL Dan
I can present the results in a table easy enough and if I replace the TD
TAGs with commas etc. I get a screen output that resembles a CSV file but
need to go that one step further and don't know how...
--
-
Michael Mason
Arras People
www.arraspe
Hi,
Depending on which database you're using, there is the UNLOAD
function. Do a google for it, should find the answers.
> I've seen many different posts on this subject and try as I might I can't
> get my data to output to CSV for a user to save.
>
> I can output to a table, I can outp
[snip]
I've seen many different posts on this subject and try as I might I
can't
get my data to output to CSV for a user to save.
I can output to a table, I can output to screen in CSV format but that
just means messing about with copy and paste.
Has anyone got any suggestions...?
[/snip]
What,
I knew that ;)
Sometimes it takes another set of eyes...
Thanks
I did. ;)
On Saturday 14 August 2004 00:31, Vern wrote:
> I got it. In case anyone is interested. I changed the code so that the
> field value was a became the variable $name and it worked.
>
> $name = $rsITEMS->Fields('item_id');
> fwrite($fp, "$name\n");
But did you take the fopen() out of the while-loop?
Hi Vern,
Vern wrote:
Sorry the previous post got sent prematurely.
this one, too. #
It does work, however, the problem is in the
fwrite($fp, "$rsITEMS->Fields('item_id')\n");
I KNOW, this was the 2nd change in my code-sniplet ;o)
what gets written to the file is exactly "$rsITEMS->Field
I got it. In case anyone is interested. I changed the code so that the field
value was a became the variable $name and it worked.
$name = $rsITEMS->Fields('item_id');
fwrite($fp, "$name\n");
Sorry the previous post got sent prematurely.
It does work, however, the problem is in the
fwrite($fp, "$rsITEMS->Fields('item_id')\n");
what gets written to the file is exactly "$rsITEMS->Fields('item_id')" for
as many records are return when I want the value to be inserted in the fi
Hi Vern,
Vern wrote:
I'm trying to write data to a file from a Postgres database using the
following code
while (!$rsITEMS->EOF) {
$fp = fopen("./dump.sql", "w");
fwrite($fp, "$rsITEMS->Fields('item_id')\n");
$rsITEMS->MoveNext();
> [shortened]
I wouldn't do that, if I were you.
It
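For clarity, the two problems being pointed out, folded into one hedged sketch
($rsITEMS is assumed to be the recordset object from the original post):
opening dump.sql with "w" inside the loop truncates the file on every pass, and
a parenthesized method call like $rsITEMS->Fields('item_id') is not expanded
inside a double-quoted string, so the literal text ends up in the file.

<?php
// Hedged corrected version of the loop from the original post.
$fp = fopen("./dump.sql", "w");            // open once, outside the loop

while (!$rsITEMS->EOF) {
    $name = $rsITEMS->Fields('item_id');   // call the method outside the string
    fwrite($fp, $name . "\n");
    $rsITEMS->MoveNext();
}

fclose($fp);
?>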
k it is an encryption problem.
> -Original Message-
> From: Anas Mughal [mailto:[EMAIL PROTECTED]
> Sent: Thursday, December 04, 2003 4:27 PM
> To: Geoffrey Thompson
> Subject: Re: [PHP] Exporting Data as CSV - IE6/HTTPS Problem?
>
> HTTPS is encrypting the URL -- includin
Damn exploder, use session_cache_limiter('private_no_expire'); before
session_start()
Ben C. wrote:
I am using the code below to export my data into an excel file. The code is
located in a password protected area which is checked against saved session
variables. However when I put session_start
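Spelled out, the fix being described looks like this (the authentication check
and the export headers are placeholders): the cache limiter has to be set
before session_start(), otherwise the no-cache headers it sends make IE refuse
to save the download from the session-protected page.

<?php
// Hedged sketch of the session_cache_limiter fix for IE + file downloads.
session_cache_limiter('private_no_expire');
session_start();

// ... check the saved session variables / authentication here ...

header('Content-Type: application/ms-excel');
header('Content-Disposition: attachment; filename="export.xls"');
// ... output the exported data ...
?>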
>I use phpMyAdmin which enables me to take dump of
>mySQL Table Data into Comma Separated Values file
>(.csv)
>
>Now, i have to create such a program that accomplishes
>this, without using phpMyAdmin. Can someone guide me
>to this procedure..
Since PHP has a fgetcsv() function (or something like t
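A rough sketch of reproducing that phpMyAdmin-style dump in plain PHP; the
connection details and table name are placeholders.

<?php
// Hedged sketch: dump a table to a .csv file with fputcsv(), the writing
// counterpart of the fgetcsv() mentioned above.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $mysqli->query('SELECT * FROM mytable');

$fp = fopen('dump.csv', 'w');

// header row from the result set's column names
$names = array();
foreach ($result->fetch_fields() as $field) {
    $names[] = $field->name;
}
fputcsv($fp, $names);

// one CSV line per data row
while ($row = $result->fetch_row()) {
    fputcsv($fp, $row);
}
fclose($fp);
?>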
> hey guys..
>
> does anyone have any ideas on how to export information from a mysql
> database to microsoft word, excel, access, note pad or any other such
> application?
>
There is a class on http://phpclasses.upperdesign.com/
That allows you to create an Excel file from PHP. Should be fairly
I always stick with PHP for web apps, but I am in a similar situation, I
have to write a script that generates statistics and item reports from a
database... The client wanted them in Excel format... I could have done
CSV, but for the sake of aesthetics, I decided to go with a native Excel
export
> I need something that will do a direct transfer to word or
> excel or the others.. not something via a cvs method..
Excel loads comma separated (CSV - CVS is something else entirely)
files happily.
Jason
Tuesday, November 20, 2001 3:39 AM
Subject: Re: [PHP] exporting
> You can use something like this from MySQL:
>
> SELECT * INTO OUTFILE 'data.txt'
> FIELDS TERMINATED BY ','
> FROM ...;
>
> This will give you a comma delimited file
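For completeness, a hedged sketch of issuing that kind of export from PHP;
'mytable' and '/tmp/data.txt' are placeholders. Note that the file is written
by the MySQL server itself, so the MySQL account needs the FILE privilege, the
path must be writable on the server machine, and the statement fails if the
file already exists.

<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$sql = "SELECT *
        INTO OUTFILE '/tmp/data.txt'
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        FROM mytable";

if (!$mysqli->query($sql)) {
    echo 'Export failed: ' . $mysqli->error;
}
?>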