PROPOSED EARTHQUAKE FORECASTING COMPUTER PROGRAM DEVELOPMENT EFFORT

Posted July 11, 2005

My main earthquake forecasting Web page is:
http://www.freewebz.com/eq-forecasting/Data.html
Newsgroup Readers: If you circulate copies of this report to groups of computer programmers at different universities etc. around the world, they might find the subject matter to be interesting. The information in this report represents expressions of personal opinion.

THE GOAL OF THIS REPORT

This is part of an effort to get some idea of how many computer programmers and other researchers around the world might be interested in participating in a project aimed at developing life saving earthquake forecasting computer programs. That effort is not presently underway, and I don't know when or if it will get started. I am simply attempting to determine whether other people believe that large numbers of volunteers would be interested in working on such a project, or whether there would instead be little interest in it. That information would be helpful for developing a plan for establishing a Web site where the project would be centered. Personnel running the following Web site have volunteered to make their site available for such an effort, but nothing has gotten underway so far.

http://www.ictwhoiswho.net/comprend/index.cfm

If quite a few people were interested in such a Web site based computer program development effort, then once it got started work would undoubtedly progress quite rapidly. If only a few were interested, it might never get started. If you would like to express an opinion on how likely people are to be interested in the idea, you can try posting a note in response to this one. The sci.geo.earthquakes or comp.lang.misc newsgroups might be appropriate if you wish to post to just one newsgroup. Or you can try contacting me by e-mail.

THE CORE OF THE PRESENT EARTHQUAKE FORECASTING PROGRAM

In connection with an earthquake forecasting effort which has been underway for the past 15 years, I believe I have been able to crack the "Earthquake Code." That means making crucially important discoveries regarding how earthquakes are being triggered. Two of them, which are discussed on the following Web page, are called the "Gravity Point" and "Earthquake Triggering Symmetry."

http://www.freewebz.com/eq-forecasting/90-05.html

The earthquake triggering and forecasting theories and data on that Web page were discussed on my behalf by one of my research colleagues in the People's Republic of China at a disaster mitigation conference in that country in December of 2003. Governments and disaster mitigation groups around the world were told about the Web site earlier this year. And my Web site visitor counter indicates that some 100 to 200 people around the world are presently downloading information from the site each day.

My earthquake forecasting computer programs use those Gravity Point and Earthquake Triggering Symmetry discoveries and others to compare electromagnetic energy field fluctuation type signals (EM signals) with more than 30,000 earthquakes which have occurred since the beginning of 1990. Some 100 to 200 signals detected during a 3 month period of time are involved. For some as yet unknown reason they are often highly selective for earthquakes which are likely to occur near populated areas, making them unusually valuable. The earthquake in the database which is the best match with all of those signals is rated # 1. The worst match would have a rating number greater than 30,000.
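To make the ranking idea concrete, here is a toy Perl sketch of it. The catalogue records, the EM signal entries, and the match_score() test are hypothetical placeholders invented for the example; they are not the actual program, data, or comparison routines.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @catalogue = (            # the real database file holds 30,000+ events
        { id => 'event A', lon =>  -86.5 },
        { id => 'event B', lon =>  140.2 },
        { id => 'event C', lon =>  -71.4 },
    );

    my @signals = ( { lon => -90 }, { lon => -85 }, { lon => 139 } );  # stand-ins

    sub match_score {
        my ($quake, $signals) = @_;
        my $score = 0;
        for my $sig (@$signals) {
            # The real program makes roughly 30 separate tests here for each
            # signal (sun and moon positions, tide crest and trough locations,
            # and so on).  This placeholder uses a single crude longitude test.
            $score++ if abs($quake->{lon} - $sig->{lon}) < 15;
        }
        return $score;
    }

    # Highest score first; the best match is rated # 1.
    my @ranked =
        map  { $_->[1] }
        sort { $b->[0] <=> $a->[0] }
        map  { [ match_score($_, \@signals), $_ ] } @catalogue;

    printf "Rating # %d: %s\n", $_ + 1, $ranked[$_]{id} for 0 .. $#ranked;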
A listing of more than 100 of the best matches is then posted, perhaps once a week, to the following Web page:

http://www.freewebz.com/eq-forecasting/Data.html

For a recent example of how well that approach to forecasting earthquakes can work, data displayed on that Web page on June 27, 2005 gave the following earthquake from my database file a # 3 rating (possible rating range: 1 to 30,000+):

2005/01/11 19:19:48  11.40N  86.51W  40.7  5.0  Near the Coast of Nicaragua
(U.S. National Earthquake Information Service data)

And less than a week later, on July 2, 2005, the following powerful and strongly felt earthquake occurred:

2005/07/02 02:16:46  11.18N  86.40W  45.5  6.7  Near the Coast of Nicaragua

11.40N and 86.51W versus 11.18N and 86.40W. Pretty good accuracy for a forecasting program! That earthquake was reportedly strongly felt in Managua. Had it occurred directly beneath the city and near the surface, I expect that it would have been devastating.

WHAT NEEDS TO BE DONE

Basically, more sophisticated data processing and data display computer program subroutines need to be developed. They could be built on my already existing computer programs and data. People would develop new subroutines, give them a try, and see if they did a better job of determining or displaying where an earthquake might be about to occur. The subroutines could be stored at the proposed Web site. Ones that were especially helpful could be merged into the main program, one version of which might run as a CGI program at the Web site. Other versions of the programs would be downloadable for free for use on personal computers.

The basic form of the data processing routine for my existing program, and probably for many other earthquake forecasting programs, might be expressed in the following manner:

Prob = aA + bB + cC + dD + eE + ...

"Prob" is the probability that an earthquake of a given magnitude will occur at a specific latitude, longitude, depth, and time. A, B, C, D, E, and so on are things such as:

A - the gradual buildup of strain in a fault zone due to the movement of the Earth's tectonic plates relative to one another

B - temporary strain added to the fault zone by bending, stretching, and compression forces related to the Solid Earth Tide. (The ground shifts a little in response to the gravity of the sun and the moon, just as ocean water does - hence the Solid Earth Tide.)

C - temporary strain added to the fault zone by the weight of ocean water shifting from one location to another in connection with ocean tides

a, b, c, d, e, and so on are "weight" factors which specify how important A, B, C, etc. are at different points in time.

"A," the gradual buildup of strain in a fault zone related to tectonic plate movement, is undoubtedly the most important factor and perhaps the only one which scientists around the world are in agreement on. It can probably be determined with a certain amount of accuracy for some fault zones at the present time, but not for very many. With my present computer programs I do not use actual values for "A" as they would be impossible to determine. Instead, when a strong EM signal is detected I simply assume that a fault zone somewhere has stored enough strain energy that it is about ready to fracture, and an effort is made to determine where that fault zone is located.

With each probability calculation my present computer programs do about 30 separate comparisons between each of the 100 to 200 EM signals and the more than 30,000 earthquakes in my database file.
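Before moving on to the details of those comparisons, here is a rough Perl sketch of that weighted-sum form, written as one probability calculation for one fault zone at one point in time. The factor and weight values are invented for the example; arriving at realistic values is exactly the hard research problem described above.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Factors A, B, C ... for one fault zone at one point in time, each
    # scaled here to lie between 0 and 1 (invented values).
    my %factor = (
        tectonic_strain  => 0.80,   # A - long term strain buildup
        solid_earth_tide => 0.35,   # B - Solid Earth Tide strain
        ocean_tide_load  => 0.25,   # C - shifting ocean water weight
    );

    # Weights a, b, c ... saying how important each factor is at this
    # particular point in time (also invented values).
    my %weight = (
        tectonic_strain  => 0.90,
        solid_earth_tide => 0.06,
        ocean_tide_load  => 0.04,
    );

    # Prob = aA + bB + cC + ...
    my $prob = 0;
    $prob += $weight{$_} * $factor{$_} for keys %factor;

    printf "Relative earthquake probability: %.3f\n", $prob;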
The comparisons involve things such as the positions of the sun and the moon in the sky and the locations of ocean and Solid Earth Tide crests and troughs around the world when the EM signal was detected and when the earthquake occurred.

For an example of one possible and relatively easy computer program improvement, an effort could be made to see if factoring in earthquake fault zone orientation (north-south versus east-west) improved the probability calculations. Another improvement would involve determining the importance of the latitudes of the Gravity Points and the sublunar points. At the present time only their longitudes are used in my calculations. (The sublunar point is the location on the surface of the Earth which a line drawn between the center of the Earth and the center of the moon would pass through.)

An earthquake forecasting group at Madras University in India has already developed some advanced earthquake location determination routines which, it appears, might be helpful to this effort. Routines developed for use at the following Web sites might also be helpful.

http://pasadena.wr.usgs.gov/step/
http://www-aig.jpl.nasa.gov/public/dus/quakesim/

THE PROGRAMMING LANGUAGE IN PRESENT USE

The original programs were written in a number of languages, including Basic. The main program is presently written in Perl. That language was chosen because it is fast and powerful, the compiler can be downloaded for free by anyone, and because it looks like it is getting sufficient support that calculations can be trusted and it will be around for a while.

There is another feature of that language which I am guessing many other programming languages probably offer these days, though that was not the case in the past. That is the ability to make changes to the program code itself while the program is running. I myself do that in the following manner:

Ordinarily the main program "P1" starts running and performs a group of calculations. That takes about 5 minutes. It then waits for a keyboard instruction telling it how the output data should be displayed. As with probably any program, once it is running in the normal mode no changes can be made to the program itself. However, when it is run in the following experimental mode that rule does not apply.

A short program I will call "P2" starts running and immediately uses a "do" statement to get the regular program P1 to compile and start running. P1 does the original calculations as normal. But instead of waiting for the operator to enter a data display command, it exits and P2 becomes active again. A display instruction is entered and P1 is told to recompile and start running again. Instead of doing all of the calculations from scratch it jumps straight to the display routine and uses the entered command to begin printing the output data. The data generated when it originally did the calculations are still in memory.

The advantage here is that at any time, program P1 can be called into a text editor, modified as desired, and then saved. A new display subroutine can be added to it or an existing routine can be modified while the previously generated data are still active in the computer. Then when P1 is directed by P2 to recompile (that takes about a second) and run, the new subroutine is included just as if it were in the original program. If the compiler encounters a programming error it terminates and returns control to program P2. A correction can then be made to the new program code and the sequence repeated.
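A minimal Perl sketch of that P2 driver arrangement might look like the following. The file name "P1.pl" and the shared package variables are assumptions made for this example; the real programs are organized somewhat differently.

    #!/usr/bin/perl
    # P2 - keeps calculated data in memory while P1.pl is edited and recompiled.
    use strict;
    use warnings;

    our $command = '';   # display command handed to P1.pl
    our %results;        # P1.pl stores its calculated data here, so the data
                         # survive each recompilation of P1.pl

    while (1) {
        # Compile and run the main program.  On the first pass P1.pl does the
        # full (roughly 5 minute) calculation; on later passes it finds
        # %results already filled in and jumps straight to its display code.
        my $ok = do './P1.pl';

        unless (defined $ok) {
            # A freshly edited P1.pl that fails to compile (or to run) ends up
            # here.  %results is untouched, so the error can be corrected in a
            # text editor and the loop simply tried again.
            warn "P1.pl failed: ", ($@ || $!), "\n";
        }

        print "Display command (or 'quit'): ";
        chomp($command = <STDIN> // 'quit');
        last if $command eq 'quit';
    }

With an arrangement along those lines, the calculated data stay in package variables between calls to "do", which is what makes the quick recompile-and-redisplay cycle possible.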
No data are lost because of the error. And you don't have to wait for 5 minutes while the program recalculates everything.

Data processing and display routines could be written in other programming languages besides Perl. If done with sufficient care, more than one language could be used at the same time. The different language routines would simply be linked with one another.

A LIFE SAVING EFFORT

It appears to me (sadly) that few governments have a very good sense of direction with regard to the development of life saving earthquake forecasting programs. The work is usually undertaken by independent research groups at various government agencies or universities. They go their own ways, uninterested in or unable to work with one another to effectively forecast deadly earthquakes. They often claim that if they share information they could lose their patent rights and potential profits etc., and that could actually happen. But isn't saving tens of thousands of lives more important than simply making a meager profit (which no one that I am aware of is presently able to do with their forecasting programs anyway)?

I myself presently own 3 U.S. copyrights related to my forecasting technology. But I have been letting interested parties around the world use it for free. See:

http://www.freewebz.com/eq-forecasting/301.html

One country where forecasting program information is shared to some extent is the People's Republic of China, where some 10,000 people reportedly work full-time in a state sponsored earthquake forecasting program. They are also supported by a small army of volunteer workers. But even there, from what I can see, the forecasting efforts of different groups are not effectively coordinated very often. And advanced forecasting technology being developed in other countries is frequently ignored.

A listing of some other earthquake forecasting programs around the world can be found on the following Web page:

http://www.freewebz.com/eq-forecasting/141.html

A book discussing one of the forecasting programs listed on that Web page is scheduled for release some time in late 2005. See:

http://www.sentientpublications.com/catalog/earthquakes.php

I myself assisted with that effort by providing the author with some free technical information and book content advice.

The proposed Internet Web site effort to develop an effective earthquake forecasting program discussed in this report could not be ignored by governments around the world. Once they saw how well the programs worked, they would be forced to begin using them to predict their own earthquakes. United Nations personnel appeared to like this concept when I formally proposed it to them in July of 2004.

http://www.unisdr.org/wcdr-dialogue/t3-dialogue.htm#34

And they discussed it repeatedly in their summary reports of the ideas proposed during that Internet Web site based conference. But no governments or disaster mitigation groups expressed any interest in developing the concept.

Since the starting point for the proposed forecasting program development efforts would be the computer programs that I already have running, success would be guaranteed. So the important question would then be, "How many computer programmers and other researchers would be interested in helping with such an effort?"

This type of work is quite interesting and exciting when you become actively involved with it. You can generate a forecast. And then when the earthquake occurs where you expected it, you can get a pretty good shock.
If the earthquake is also destructive, the experience can be rather frightening.

The importance of the work should be obvious. More than a quarter of a million people reportedly perished in connection with the December 26, 2004 earthquake generated tsunami (tidal wave) in the Indian Ocean. My present forecasting computer programs did not become operational until several weeks after it occurred. But when I ran my EM signal data from around that time through the programs, the results indicated to me that the earthquake could have been predicted.

Why devote valuable free time to developing computer programs which do ordinary things when that time could be devoted to developing programs which might eventually help save tens or even hundreds of thousands of lives?

--
http://mail.python.org/mailman/listinfo/python-list