Hi,
I need a bit of help with a mySQL query. I have a list of users in a text
file called 'users':
u655354
u687994
u696974
u728141
..
..
and I need to check the last date each user logged on to the proxy in the
last 3 months.
In my database, there is a table for the users:
mysql> desc user_tabl
Just reposting in case anyone missed it! Or have I asked a dumb question?!
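For what it's worth, one way to build that query from the 'users' file, sketched in Python. The table and column names (internet_usage, uid, time) are guesses based on later messages in this thread, so treat this as a starting point only:

```python
# Hypothetical sketch: build a "last login per user" query from the
# 'users' text file. Table and column names (internet_usage, uid, time)
# are assumptions based on later messages in this thread.
def build_last_login_sql(user_ids):
    id_list = ", ".join("'%s'" % u for u in user_ids)
    return (
        "SELECT uid, MAX(time) AS last_login "
        "FROM internet_usage "
        "WHERE uid IN (%s) "
        "AND time >= DATE_SUB(NOW(), INTERVAL 3 MONTH) "
        "GROUP BY uid;" % id_list
    )

# In practice user_ids would come from open('users').read().split()
users = ["u655354", "u687994", "u696974", "u728141"]
print(build_last_login_sql(users))
```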
Hi,
I've run a safe recover on a table after I had to recreate the index file:
# myisamchk --tmpdir=/proxydb/mysql/tmp --safe-recover internet_usage
- recovering (with keycache) MyISAM-table 'internet_usage'
Data records: 290804216
Data records: 519541696
#
After that I ran a query on the table, bu
That failed with an error 27 as well. How do I set the debug flag on
myisamchk? --debug=/tmp/debug doesn't work.
JS.
Hi.
Sounds OK, but don't forget to make a backup.
"J S" <[EMAIL PROTECTED]> wrote:
> Thanks for your reply Gleb. I tried running with --safe-recov
After I ran some database updates last night, it seems to be working OK
again today. Not sure what happened! Thanks anyway.
JS.
Hi.
Uncommon situation.
Please tell me, what OS do you use, which MySQL version, and how was it
compiled? Are there any clues at the end of the .err file?
"J S" <[EM
Hi,
I have this query which used to work really fast but now it just seems to
lock up the tables. I tried running it with EXPLAIN but it just hangs at the
command prompt. I also tried adding the USE INDEX directives but still no
luck. I've run myisamchk -m on both tables involved and no errors w
-sort-index --analyze internet_usage
Sound OK?
JS.
Hi.
Usually myisamchk doesn't have the suid bit set, and it seems to have been run
as root.
See
http://dev.mysql.com/doc/mysql/en/Table_maintenance.html
Especially
http://dev.mysql.com/doc/mysql/en/myisamchk_memory.html
"J S" <[
Hi,
I've run myisampack to compress a table and am now trying to recreate the
index using myisamchk (as root) but I get an error 27.
# myisamchk -rq --sort-index --analyze --tmpdir=/proxydb/mysql/tmp
internet_usage.MYI
- check record delete-chain
- recovering (with sort) MyISAM-table 'internet_
> I've just accidentally run myisampack (myisampack Ver 1.22) on a table which
> I still need to update. Is there any way I can unpack it?
myisamchk --unpack .MYI
http://dev.mysql.com/doc/mysql/en/myisamchk_repair_options.html
Alec
Thanks Alec,
I haven't actually run myisamchk -rq --sort-ind
Hi,
I've just accidentally run myisampack (myisampack Ver 1.22) on a table which
I still need to update. Is there any way I can unpack it?
Thanks,
JS.
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]
Are there any variables here I can tune to try and increase the speed of the
query?
mysql> show variables like '%buff%';
+-------------------------+--------+
| Variable_name           | Value  |
+-------------------------+--------+
| bdb_log_buffer_size     | 262144 |
| bulk_insert_buffer_size |
How fast do your disks rotate? What kind of disk controller are you using
(ATA, SATA, SCSI, ...)? Are they in a RAID array or not? If they are, what
RAID scheme are you using?
I had to ask about this one because I'm not too sure about these sort of
things. Apparently the box is using ESS disks on
cents, too!
Shawn Green
Database Administrator
Unimin Corporation - Spruce Pine
"J S" <[EMAIL PROTECTED]> wrote on 09/09/2004 04:48:47 AM:
> Hi Shawn,
>
> I'm working at the moment on the new database! I had a quick question
for
> you. Here's my table structure:
Hi,
I want to copy data from one table to another. However I need to do this
carefully because I haven't got much space left. I was thinking of maybe
selecting data from every 10 rows of the old table, inserting it into
the new table, then deleting those rows from the old table.
Could so
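The batched copy-then-delete idea could look something like this Python sketch. sqlite3 stands in for MySQL here, and the table names and batch size are invented for the demo:

```python
import sqlite3

# Sketch of a batched copy-then-delete, using sqlite3 as a stand-in for
# MySQL. Table names (old_t, new_t) and the batch size are illustrative.
BATCH = 10  # rows moved per pass

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE old_t (id INTEGER PRIMARY KEY, v TEXT)")
cur.execute("CREATE TABLE new_t (id INTEGER PRIMARY KEY, v TEXT)")
cur.executemany("INSERT INTO old_t VALUES (?, ?)",
                [(i, "row%d" % i) for i in range(25)])

while True:
    # Grab the next small batch from the old table.
    rows = cur.execute(
        "SELECT id, v FROM old_t ORDER BY id LIMIT ?", (BATCH,)).fetchall()
    if not rows:
        break
    # Copy the batch, then free the space by deleting it from the source.
    cur.executemany("INSERT INTO new_t VALUES (?, ?)", rows)
    cur.executemany("DELETE FROM old_t WHERE id = ?", [(r[0],) for r in rows])
    conn.commit()

print(cur.execute("SELECT COUNT(*) FROM new_t").fetchone()[0])  # 25
```

Note that with MyISAM the deleted rows only return disk space after an OPTIMIZE TABLE, so this may not save as much space mid-copy as it first appears.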
Sorry me again! I haven't had any success with this thread :( But could
someone at least tell me the best way to debug this please? Are there any
options I can provide to increase the logging detail?
Thanks,
JS.
Does anyone know what goes on (system wise) during the CREATE INDEX ? I can
see it's copying the table file, what then though?
Hi, I'm stuck on this really badly. Can anyone help me out please?
Thanks,
JS.
I tried running the "create index" again using --verbose but this is all I
got:
ERROR 3 at line 1: Error writing file '/proxydb/mysql/tmp/STLd39Et'
(Errcode: 27)
--
CREATE INDEX timeindex ON internet_usage (uid,time)
--
Bye
Can anyone tell me the specifics of what happen
tem.
This should be a useful thread for you
http://lists.mysql.com/mysql/167953
Suggests using ALTER TABLE MAX_ROWS=
On Fri, 23 Jul 2004 20:20:24 +, "J S" <[EMAIL PROTECTED]> wrote :
> Hi,
>
> I'm trying to run the following SQL (on
> mysql-max-4.0.20-ibm-aix4
Hi,
I'm trying to run the following SQL (on
mysql-max-4.0.20-ibm-aix4.3.3.0-powerpc):
CREATE INDEX timeindex ON internet_usage (time);
CREATE INDEX urlindex ON internet_usage (urlid);
but keep running into the following error:
Error writing file '/proxydb/mysql/tmp/STeR39Er' (Errcode: 27)
$ perro
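Note that the Errcode in messages like these is an operating-system errno, which MySQL's perror utility decodes. On Linux at least, errno 27 is EFBIG ("File too large"), which fits the 2 GB file-size theory; a quick check in Python:

```python
import errno
import os

# The Errcode in MySQL's message is an operating-system errno.
# On Linux, errno 27 is EFBIG ("File too large"); other Unixes may differ.
print(errno.EFBIG)               # 27 on Linux
print(os.strerror(errno.EFBIG))  # "File too large" with glibc
```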
Can you explicitly create temporary tables?
Yes, I have full privileges. Do you mean I could copy the internet_usage
table to a temp table with new keys?
JS.
-----Original Message-----
From: J S
To: [EMAIL PROTECTED]
Sent: 7/16/04 3:54 AM
Subject: RE: ERROR 3 at line 1: Error writing file
It's tricky isn't it?!
JS.
-----Original Message-----
From: J S
To: [EMAIL PROTECTED]
Sent: 7/15/04 2:29 PM
Subject: ERROR 3 at line 1: Error writing file
'/proxydb/mysql/tmp/STTbtrac'
(Errcode: 27
Hi,
I'm trying to create an index using the command:
CREATE INDEX urlindex ON
Hi,
I'm trying to create an index using the command:
CREATE INDEX urlindex ON internet_usage (urlid)
| Field | Type             | Null | Key | Default | Extra |
| urlid | int(10) unsigned |      |     | 0       |       |
But I keep getting the following error:
ERROR 3 at line
In the last episode (Jul 14), J S said:
> Thanks for your reply. Can I just check my syntax is right for
> creating the key?
>
> CREATE INDEX urlindex ON internet_usage (time);
>
> I ran this before but got an error 27 (even though the filesize ulimit
> was set to unlimited).
In the last episode (Jul 14), J S said:
> I want to find the earliest and latest times in the time column of my
> table internet_usage:
>
> +--+--+--+-+-+---+
> | Field| Type | Null | Key |
That was it! Thanks so much.
JS
JS:
When I use DATE_FORMAT(), if I have a space character between DATE_FORMAT
and the ( I get an error. Try writing it like:
SELECT DATE_FORMAT( "20040601123456", '%Y-%m-%d' );
MySQL v4.1.2-alpha-standard
Mac OS X
Wes
On Jul 14, 2004, at
Hi,
I want to find the earliest and latest times in the time column of my table
internet_usage:
+--+--+--+-+-+---+
| Field| Type | Null | Key | Default | Extra |
+--+--+--+-+---
date_added'];
$date = date("M d, Y", strtotime($date_added));
echo $date;
H and i do work in PHP as well, I believe, if you want to add the hour and
seconds. Check out the 'date' function in PHP for info.
- Eve
In MySQL you can do that with: DATE_FORMAT
DATE
Hi,
What SQL function do I need to convert timestamp 200406011403 into
"2004-06-01 14:03" ?
Thanks,
JS.
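For checking the expected output of the MySQL answer above, the same conversion in Python (server-side, DATE_FORMAT with '%Y-%m-%d %H:%i' does the job):

```python
from datetime import datetime

# Reformat a compact timestamp like 200406011403 into "2004-06-01 14:03".
# The MySQL equivalent is DATE_FORMAT(ts, '%Y-%m-%d %H:%i').
def pretty(ts):
    return datetime.strptime(str(ts), "%Y%m%d%H%M").strftime("%Y-%m-%d %H:%M")

print(pretty(200406011403))  # 2004-06-01 14:03
```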
Hi,
I have the following tables in my database for proxy logs. I have broken
down the URL for each line into the following 4 tables:
url_schemes e.g. http
url_servers e.g. www.google.com
url_paths e.g. /search
url_queries e.g. q="searchstring"
To get a list of the websites a user has
Hi,
The code in my function below (insert_internet_usage) has suddenly
dramatically slowed down and I don't understand why. The logs I'm parsing
are the same size as before, but what used to take 15 minutes has shot up to
about 4 hours. It's almost as if the indexes had stopped working.
I wonder
On Wednesday 30 June 2004 12:10 pm, J S wrote:
> Hi,
>
> I would like to switch off the logging that creates the large files
below
> (in the mysql data directory)
> Could someone tell me how to do this please?
take out log-bin from my.cnf. But you better not have replication
Hi,
I would like to switch off the logging that creates the large files below
(in the mysql data directory)
Could someone tell me how to do this please?
Thanks,
js.
# ls -l
total 5418648
-rw-rw---- 1 mysql mysql 25088 Jun 28 14:27 ib_arch_log_00
-rw-rw---- 1 mysql mysql
e wait.
Also personally I would change the ip from a bigint to an int, if that is
really an IP address like it seems.
Just my opinion. No matter what I would make a backup of your data before
making any changes.
Donny
-----Original Message-----
From: J S [mailto:[EMAIL PROTECTED]
Sent: Friday, J
Even the ALTER gave me an error!
mysql> alter table internet_usage change uid uid char(10);
ERROR 1030: Got error 127 from table handler
mysql>
J S wrote:
Thanks for your reply. I had to change the IP column to bigint because
mysql was inserting the wrong value when it was just int.
Are yo
Figured it out! Took a gamble and ran the command below!
SELECT iu.time, INET_NTOA(iu.ip), concat(usc.scheme,"://",us.server,up.path)
FROM url_visit uv
INNER JOIN internet_usage iu
ON iu.urlid=uv.urlid
INNER JOIN url_servers us
ON us.id=uv.url_server_ID
INNER JOIN url_paths up
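As an aside, INET_NTOA/INET_ATON are easy to mirror in Python, which also shows why an unsigned 32-bit column is enough for an IPv4 address (no BIGINT needed):

```python
import socket
import struct

# Python equivalents of MySQL's INET_NTOA / INET_ATON. The largest IPv4
# address, 255.255.255.255, is 2**32 - 1 = 4294967295: it overflows a
# signed INT but fits MySQL's INT UNSIGNED.
def inet_ntoa(n):
    return socket.inet_ntoa(struct.pack("!I", n))

def inet_aton(s):
    return struct.unpack("!I", socket.inet_aton(s))[0]

print(inet_ntoa(3232235777))         # 192.168.1.1
print(inet_aton("255.255.255.255"))  # 4294967295
```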
Hi Shawn,
I wondered if you might be able to help me with an SQL query.
I want to list all the internet sites I've surfed in my database.
Here's a query that matches the url with a urlid:
SELECT concat(usc.scheme,"://",us.server,up.path)
FROM url_visit uv
INNER JOIN url_servers us
ON us.id=u
which means it has varchar's,
text,
> >or
> >blobs the limit is 2 gigs. If you are using a fixed table which uses
> chars
> >only, then there is no limit that I have seen.
> >
> >Donny
> >
> > > -Original Message-
> > > From: J
'uid' is a mix of chars and ints, e.g u752352.
Yes, if you are using a dynamic table which means it has varchar's, text,
or
blobs the limit is 2 gigs. If you are using a fixed table which uses chars
only, then there is no limit that I have seen.
Donny
> -Original Message--
Could this problem be due to the size of my tables? Is there a limit on how
big the table can be?
I'm using mysql-standard-4.0.20.
-rw-rw---- 1 mysql mysql 2147483647 Jun 25 01:49 internet_usage.MYD
-rw-rw---- 1 mysql mysql  622724096 Jun 25 01:49 internet_usage.MYI
Hi,
I got an er
Hi,
I got an error 27.
DBD::mysql::st execute failed: Got error 27 from table handler at
/home/u752359/logfile.pl line 144, line 3079464.
The FAQs say:
Check whether you have hit 2 Gb limit.
If that is not the case, shutdown MySQL server and repair a table with
(my)isamchk.
How do I check if I h
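A rough way to check the 2 GB theory from a script, assuming you know the path to the .MYD file (the path below is a throwaway temp file for the demo):

```python
import os
import tempfile

# Rough check of the 2 GB theory: compare the data file's size against
# the old MyISAM default limit of 2**31 - 1 bytes. The temp file here
# is a stand-in for the real path to internet_usage.MYD.
LIMIT = 2**31 - 1  # 2147483647 bytes

def at_2gb_limit(path):
    return os.path.getsize(path) >= LIMIT

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"x" * 1024)
    name = f.name
small = at_2gb_limit(name)
print(small)  # False for this 1 KB demo file
os.unlink(name)
```

(The directory listing later in this thread shows internet_usage.MYD at exactly 2147483647 bytes, so the limit theory looks right.)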
anks!
Shawn Green
Database Administrator
Unimin Corporation - Spruce Pine
"J S" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
cc: [EMAIL PROTECTED]
|  2 | tcp   |
|  3 | -     |
|  4 | ftp   |
|  5 | https |
+----+-------+
5 rows in set (0.00 sec)
J S,
Check to see if the url_scheme_ID is part of a unique constraint/key or the
Primary Key. If it isn't, you will get dupes even if you use INSERT IGNORE
unless you pre-screen your INSERTs some other
uce Pine
"J S" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
cc: [EMAIL PROTECTED]
06/22/2004 07
ata require (including indices)?
Yours,
Shawn Green
Database Administrator
Unimin Corporation - Spruce Pine
"J S" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
I think I fixed it!
INSERT IGNORE url_visit (url_server_ID, url_path_ID, querystring,
category)
SELECT DISTINCT us.ID, up.ID, if(bt.path_split >0, SUBSTRING(bt.url,path),
NULL),
bt.category
FROM bulk_table bt
INNER JOIN url_servers us
ON us.server = bt.server
INNER JOIN url_paths up
on
Did you mean there to be duplicates in the url_visits? Do I need to use
IGNORE in the following SQL?
INSERT url_visit (url_server_ID, url_path_ID, querystring,
category)
SELECT us.ID, up.ID, if(bt.path_split >0, SUBSTRING(bt.url,path),
NULL),
bt.category
FROM bulk_table bt
INNER JOIN url_servers
Shawn,
Thanks for your reply below. I found it extremely useful. I have followed
your instructions and got good results up to the url_visits table.
I have a perl script to parse the values out of the log. The log has
3,770,246 lines and is gzipped. I then applied your SQL statements with the
fo
t machines I guess I
would have to see more data to understand your problem better.
Yours,
Shawn Green
Database Administrator
Unimin Corporation - Spruce Pine
"J S" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Shawn,
Thanks for helping on this. I really appreciate it.
No problem!!
Please post the structures of your "big_table" and your "url_table"
(whatever you called them) and I will help you to rewrite step 4 to count
how many times a URL appears in the "big_table".
mysql> desc internet_usage;
+--
Mos forgot to populate the url_id column in your user table. I would use
his same process but re-arrange it like this:
1) create table BIG_TABLE
2) load data infile
3) create table URL_TABLE (
url_id bigint not null auto_increment,
url varchar(25) not null primary key,
c
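The staging-table flow above (load everything raw, then populate a URL-to-id table and count how often each URL appears) can be sketched end to end, with sqlite3 standing in for MySQL and all names illustrative:

```python
import sqlite3

# Sketch of the staging-table flow: load raw rows into big_table, then
# de-duplicate URLs into url_table with an auto-increment id and a hit
# count. sqlite3 stands in for MySQL; names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE big_table (url TEXT)")
cur.execute("""CREATE TABLE url_table (
    url_id INTEGER PRIMARY KEY AUTOINCREMENT,
    url    TEXT NOT NULL UNIQUE,
    hits   INTEGER)""")

# Step 2 of the flow: in MySQL this would be LOAD DATA INFILE.
cur.executemany("INSERT INTO big_table VALUES (?)",
                [("a.com",), ("b.com",), ("a.com",)])

# De-duplicate URLs and record how many times each appeared.
cur.execute("""INSERT INTO url_table (url, hits)
               SELECT url, COUNT(*) FROM big_table GROUP BY url""")

for row in cur.execute("SELECT url_id, url, hits FROM url_table ORDER BY url"):
    print(row)
```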
Hi,
I have a perl script which runs a bulk insert. When I run an insert with
about 100,000 lines it keels over with the following message:
DBD::mysql::st execute failed: Out of memory (Needed 6100848 bytes) at
./parse.pl line 227, <> line 15.
There is 8GB of memory on the box so I'm sure th
with
source '/tmp/updates.txt';
Michael
J S wrote:
Hi,
I'm trying to load a file containing a set of updates e.g.
UPDATE url_table SET hits=3 WHERE
url="http://cache.unicast.com/java60/HTMLad_utils/ad2applet.js";
UPDATE url_table SET hits=2 WHERE
url="http
Hi,
I'm trying to load a file containing a set of updates e.g.
UPDATE url_table SET hits=3 WHERE
url="http://cache.unicast.com/java60/HTMLad_utils/ad2applet.js";
UPDATE url_table SET hits=2 WHERE
url="http://abclocal.go.com/images/wabc/2003/wabc_images_whatsonseven.gif";
UPDATE url_table SET hi
Hi,
I need some help please! I have 60GB of proxy logs to parse and load into
a mysql database. I've written a parsing script but I'm stuck now on how
to load the data in.
I have a database called PROXY_LOG with 2 tables:
USER_TABLE
user_id date_time url_id size
and
URL_TABLE
url_id url