Webmastersite.net

Script causing host to max out CPU utilization

Comments on Script causing host to max out CPU utilization

badgoat
Member

Usergroup: Customer
Joined: Nov 13, 2005

Total Topics: 7
Total Comments: 14
Posted Dec 10, 2005 - 11:44 AM:

Hello,

My host shut down my account and said that I was maxing out their CPU utilization. What can I do to make the script run more efficiently? I asked how many connections they had at the time, and they told me 75.

I pruned almost everything in the 'Switches' area, shutting off just about all of the ones which were on by default. What else can I do?
peumus
Forum Regular

Usergroup: Customer
Joined: Aug 09, 2004
Location: Chile

Total Topics: 172
Total Comments: 462
Posted Dec 10, 2005 - 7:40 PM:

I would say the solution is here:

https://www.webmastersite.net/forums/thread/5827

I also want to start working on it soon.
badgoat
Member

Usergroup: Customer
Joined: Nov 13, 2005

Total Topics: 7
Total Comments: 14
Posted Dec 11, 2005 - 6:32 AM:

According to that thread, it's the toplists that are causing his server load. I only have one running on mine (if I'm understanding what a toplist is). On my site I have the top 5 most recently added links... Is that one of the toplists? What are the others? And where can I turn them off?
Paul
developer

Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California

Total Topics: 61
Total Comments: 7868
Posted Dec 11, 2005 - 3:19 PM:

Check the execution time of the average page (debug mode). Something around 1 second would be normal; anything much over that indicates an issue.

Toplists aren't the only possibility, and if you haven't added any they're an unlikely culprit; it's custom-added toplists that tend to cause load. Consider scripts.webmastersite.net/w...ry_size_exhausted-209.html
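
(That page appears to deal with PHP's "Allowed memory size exhausted" error; if your host permits it, the usual remedy is raising memory_limit in php.ini, along these lines. The 32M figure is just an illustrative value, not a recommendation from that page:)

```ini
; php.ini (host permitting): raise PHP's per-script memory cap.
; 8M was a common default at the time; pick a value your host allows.
memory_limit = 32M
```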

Also, using HTML generation without the "distribute load" option selected can cause CPU usage to spike to 100%.

I asked how many connections they had at the time, and they told me 75.

If that means 75 people on the site at the same time, that seems like quite a lot. It could be an unfriendly bot spidering the site at an unhealthy rate.
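
One way to check is to count requests per user agent in your raw access log. A quick sketch (this assumes the common Apache combined log format, and the log path is a placeholder you'd adjust for your host):

```python
# Count requests per user agent in a combined-format access log.
# The log path is a placeholder; adjust for your host's setup.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path

# In combined log format, the user agent is the last quoted field on the line.
agent_re = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = agent_re.search(line)
        if match:
            counts[match.group(1)] += 1

# The heaviest requesters are the likeliest spiders to investigate.
for agent, hits in counts.most_common(10):
    print(f"{hits:6d}  {agent}")
```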
badgoat
Member

Usergroup: Customer
Joined: Nov 13, 2005

Total Topics: 7
Total Comments: 14
Posted Dec 13, 2005 - 6:45 AM:

Hi Paul,

I checked the execution time, and it ranged from 0.8 sec to 4.8 sec without my having changed or done anything. With fewer switches enabled, the execution times are at the lower end of the range.

I am not familiar with the HTML generation. If it's unused by default, or "distribute load" is on by default, then that isn't an issue for me. Where in the admin panel can I find these settings?

It may very well be an unfriendly spider, as I submitted the site to around 100 search engines. Site traffic went from almost nothing (5 people on the site at once) to a fair amount (154 people at the same time).

I only have 250 links, and absolutely nothing has been customized, except turning off switches to reduce the server usage.

My host is not very friendly; they turned the account off without giving any detail other than that the CPU hit 100% utilization. They offer no debug information.

What else can I check? Can anyone suggest a friendlier, more tolerant, better-equipped host?
Paul
developer

Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California

Total Topics: 61
Total Comments: 7868
Posted Dec 13, 2005 - 4:23 PM:

Where in the admin panel can I find these settings?

'Generate HTML' is toward the bottom of the menu. HTML generation is off by default, though, so if you didn't turn it on, it isn't on. It does offer a possible (if not very attractive) solution as well: you could generate an HTML version of your directory, which would reduce the load a lot (except at the moment it's generating, which as mentioned causes a load spike).

I presume you have a robots.txt... WSN should create one by default that sets a crawl-delay of 20. You might try increasing that to 60, though the problem could just be less friendly bots that don't obey the crawl-delay value.
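
The relevant lines would look something like this (note that crawl-delay is a non-standard directive, so only some crawlers honor it):

```
User-agent: *
Crawl-delay: 60
```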

I have a *nix-only method for locking out guests when the server load gets too high, but that wouldn't be a good fit when most visitors aren't registered. In the future perhaps I'll add something to lock out known spiders depending on the server load, though that will still require figuring out the user agents of all the spiders involved.
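
The general idea is simple enough; a rough sketch (not the actual code, and the threshold and guest check here are placeholder assumptions):

```python
# Rough sketch of a *nix-only load-based guest lockout; not WSN's actual code.
# LOAD_THRESHOLD and is_guest are placeholder assumptions for illustration.
import os

LOAD_THRESHOLD = 5.0  # 1-minute load average above which guests are turned away

def too_busy() -> bool:
    """True when the 1-minute load average exceeds the threshold (*nix only)."""
    one_minute, _, _ = os.getloadavg()
    return one_minute > LOAD_THRESHOLD

def is_guest(session) -> bool:
    """Placeholder: real code would check the visitor's login state."""
    return session.get("user_id") is None

def handle(session):
    if is_guest(session) and too_busy():
        # Serve guests a lightweight "try again later" page instead of a full render.
        return "503: server is busy, please try again in a few minutes"
    return "...full page render here..."
```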

http://www.spenix.com hosts this site and was quite friendly with a high traffic forum I had there before as well.

If you don't want to worry about hosts complaining about resource usage at all, a virtual private server can cap the resources available to you, so your site merely slows down without affecting anyone else, and the host won't care. A VPS is expensive, though. I use http://www.tektonic.net ; I couldn't say I'm thrilled with them, but they're okay.
This thread is closed, so you cannot post a reply.