Webmastersite.net

Static File Generation
Can I cronjob it?

Comments on Static File Generation

Synozeer
Forum Regular

Usergroup: Customer
Joined: Jun 02, 2004

Total Topics: 32
Total Comments: 142
Posted Nov 23, 2005 - 2:30 PM:

As traffic to my site running WSN grows, the server it's on has started slowing to a crawl. Some testing revealed that the 3 toplists on my index page were the culprits, sometimes driving my load up to 40+. As soon as I took the toplists down, the load went away.

A while back I tried converting one of the toplists into a static file, but whenever the static files were created, that file would end up as a 0-byte file. My guess is that because the automatic static file generation routine is triggered by someone visiting the site at the time the routine is supposed to run (rather than purely by a cronjob), the routine gets run more than once when lots of traffic hits the site around that exact moment.

I've actually seen the static file being created, then a few moments later a 0 byte file, then created, and then a 0 byte file as the routine is run multiple times.

Since I don't think there's a fix for this other than rewriting that portion of the code entirely, I'd like to know how I can set up a cronjob to create the static files instead of using WSN's built-in automatic static generation routine. An added bonus is that I'll be able to run it more often than once every 12 hours.

Thanks,
Adam
Paul
developer

Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California

Total Topics: 61
Total Comments: 7868
Posted Nov 23, 2005 - 5:38 PM:

Do you mean a WSN cron or a server cron? I'll guess server. A wget of a file containing
<?php
// Standalone export script: fetch each configured URL and write the
// result to its corresponding static file.
require 'start.php';
if ($settings->staticexporturllist != '')
{
    $lists = explode("\n", decodeit(stripslashes($settings->staticexporturllist), 'full'));
    $names = explode("\n", $settings->staticexportnamelist);
    $n = sizeof($lists);
    for ($x = 0; $x < $n; $x++)
    {
        $url = trim($lists[$x]);
        if ($inadmindir) $writeto = '../'; else $writeto = '';
        $writeto .= trim($names[$x]);
        @chmod($writeto, 0666);
        if (strstr($url, '?')) $url .= '&exporting=1'; else $url .= '?exporting=1'; // make sure it doesn't get into a loop
        $content = geturl($url);
        $writeit = filewrite($writeto, $content);
    }
}
require 'end.php';
?>

and then emptying commonfuncs.php's exportstaticpages function to become
function exportstaticpages()
{
    return true;
}

should do it I suppose, though I've never actually set a cronjob in my life.

"the routine is getting run more than once from lots of traffic hitting the site at around that exact moment."

Possible. Hard to verify. Harder to fix since there's no locking mechanism like with mysql.
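One way to approximate a locking mechanism without touching the PHP code is to do the locking at the cron level. A sketch, assuming a Linux host with util-linux's flock(1) available; the lock-file path and script name here are made up for illustration:

```shell
# Hypothetical crontab entry: flock holds an exclusive lock for the
# duration of the export, and -n makes any overlapping run exit
# immediately instead of piling up behind the first one.
# 45 * * * * flock -n /tmp/wsn-export.lock wget -q -O /dev/null 'http://www.MYSITE.com/exportcron.php'

# Demonstration from a shell prompt: hold the lock on fd 9, then
# watch a second attempt back off instead of running concurrently.
exec 9>/tmp/wsn-export.lock
flock -n 9 && echo "export running"
flock -n /tmp/wsn-export.lock -c 'echo overlap' || echo "skipped: already running"
exec 9>&-
```

Since the lock is released when the export finishes, a legitimate later run is unaffected; only simultaneous duplicates get dropped.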
Synozeer
Forum Regular

Usergroup: Customer
Joined: Jun 02, 2004

Total Topics: 32
Total Comments: 142
Posted Nov 24, 2005 - 11:28 PM:

Here's the latest on this:

I let WSN Links try to do the static HTML by itself (5 jobs: 4 toplists, 1 RSS), but both times it only got through the first one before crashing Apache. When I looked at my Apache status while this was happening, one of the links pasted in the Static HTML box showed up 30+ times, so it was indeed trying to run the same routine over and over again.

I then created a cronjob on my server to run the above snippet and emptied out commonfuncs.php's exportstaticpages routine. This appeared to work correctly... half of the time. The other half of the time, it would create all the files listed in the Static HTML section (which is correct), but then a minute or so later, all sites on the server would stop responding. Server load appeared normal, but when I looked at active processes and services, NOTHING was showing up: no qmail, no kswapd, nothing else that normally runs. After 15 minutes, all sites were accessible again as if nothing had happened. Apache did not go down, though.

My cron job:

45 5 * * * /usr/bin/php /PATH/TO/PHP SNIPPET >/dev/null 2>&1

My Static HTML jobs (with stuff in CAPS having been renamed for posting):

http://www.MYSITE.com/rssfeed.php?type=links&number=20&field=time&ascdesc=descending&thecondition=type='regular'&title=MYSITE
www.MYSITE.com/index.php?cu...]=descending&type[1]=links
www.MYSITE.com/index.php?cu...]=descending&type[1]=links
www.MYSITE.com/index.php?cu...]=descending&type[1]=links
www.MYSITE.com/index.php?cu...&number[1]=8&field[1]=rand()&ascdesc[1]=descending&type[1]=links


(each of these obviously also has an accompanying filename that it's saved as)

I didn't use wget - should I be? From the research I did, wget didn't seem to apply here.

The most interesting thing is that when I call the snippet from my browser, it works every time without problems. It's only from the cronjob that I have this problem.
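One classic difference between a browser hit and a CLI run, worth checking here: cron executes the job from the invoking user's home directory, so the snippet's relative require 'start.php' may fail to resolve when run via /usr/bin/php. A sketch under that assumption, with the path and script name as placeholders in the same style as the line above:

```shell
# cd into the WSN directory first so the snippet's relative requires
# ('start.php', 'end.php') resolve; path and filename are placeholders.
45 5 * * * cd /PATH/TO/WSN && /usr/bin/php exportcron.php >/dev/null 2>&1
```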

EDIT: I did a little more research and saw how to properly use wget. I tried it out, and it appears to work every time. For some reason, doing the cronjob the way I had it causes problems. Here's what I'm using now:

45 * * * * wget -O - http://www.MYSITE.com/PHP SNIPPET
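Two crontab quirks worth knowing if the URL on that line ever carries a query string (speculative here, since the actual script name is elided): cron treats a bare % as a newline, and the shell treats an unquoted & as a background operator, silently truncating the command. Quoting the URL sidesteps both; the filename below is hypothetical:

```shell
# Quote the URL so '&' isn't taken as a shell background operator,
# and escape any literal '%' as '\%' (cron turns bare '%' into a
# newline). -q suppresses wget's progress chatter in cron mail.
45 * * * * wget -q -O /dev/null 'http://www.MYSITE.com/exportcron.php'
```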

Adam
Synozeer
Forum Regular

Usergroup: Customer
Joined: Jun 02, 2004

Total Topics: 32
Total Comments: 142
Posted Nov 26, 2005 - 3:59 PM:

I just wanted to emphasize that this change made a HUGE difference. My site (and server) running WSN Links is faster by many multiples. The toplists were taking up a gigantic amount of processing power on my server, and even though I wanted them as static files, I couldn't use them, because when the static files were being created, it would keep trying to process each one hundreds of times, crashing my server. Switching to a server cronjob not only fixes that problem, but also lets you update your static HTML pages as often as you want; I'm updating once an hour now.

I figured that I'd clarify my results so that anyone who is having speed issues with their site could attempt to fix it.

-Adam