WSN Links very resource intensive?
Synozeer
Forum Regular
Usergroup: Customer
Joined: Jun 02, 2004
Total Topics: 32
Total Comments: 142
The problem is, will you ever have access to a high traffic site to test this on, or will it forever be pushed back to the "cannot reproduce" list?
I've got robots.txt in place already, but not all search engines honor crawl-delay. Plus, I don't think that's what's causing the majority of these problems. I notice I get a lot of my crashes when WSN Links is doing its every-12-hours update of static pages, RSS feeds, and the like. The email problem, where it sends duplicates, happens at all times of the day and night.
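(For reference, a crawl-delay rule is only a couple of lines in robots.txt, along the lines of the sketch below; the delay value and the disallowed path are just examples, and Googlebot in particular ignores the Crawl-delay directive, which is part of why it doesn't solve this.)

User-agent: *
Crawl-delay: 10
# Example only: keep an especially expensive dynamic page out of crawls
Disallow: /search.php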
Adam
Forum Regular
Usergroup: Customer
Joined: Feb 19, 2003
Total Topics: 23
Total Comments: 106
Well Paul, if you remember, I had a problem after upgrading from 2.53, which ran fast and easy with 8 database queries; now it is 28, and we had a discussion about it similar to this one.
Well, I just moved the site to a dedicated box, and that sped things up a bit, but not for long.
I just decided to check whether you had figured out a way to fix that, because it took 100.0689378 seconds to generate the index page. This is a bit too much!
We need help here!
developer
Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California
Total Topics: 61
Total Comments: 7868
28 database queries is perfectly normal (8 was probably a bug where queries weren't being totaled correctly), and on any normal server they run very fast; most of the execution time is PHP rather than MySQL. 100 seconds is simply too absurd to say anything about: obviously you're doing something very unusual, such as not upgrading properly. I can't tell you what you're doing, since you haven't provided any info (no debug output), but as you can see at http://links.webmastersite.net/?debug=4 I get 0.31 seconds. If you check Synozeer's site, he's getting a reasonable 1.05 seconds (0.72 inside categories).
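For anyone who wants to see where their own time goes without the built-in debug output, a rough hand-rolled version of the same measurement is sketched below; this is not WSN's actual debug code, and the connection details and query are placeholders.

<?php
// Rough sketch: separate total page time from time spent waiting on MySQL.
// NOT WSN's actual debug code; connection details and the query are placeholders.
$pageStart  = microtime(true);
$mysqlTime  = 0.0;
$queryCount = 0;

$db = new mysqli('localhost', 'user', 'password', 'wsnlinks'); // placeholder connection

function timed_query(mysqli $db, string $sql, float &$mysqlTime, int &$queryCount)
{
    $start = microtime(true);
    $result = $db->query($sql);              // run the query
    $mysqlTime += microtime(true) - $start;  // accumulate time spent inside MySQL
    $queryCount++;
    return $result;
}

// Page logic would route every query through timed_query(); one trivial example:
timed_query($db, 'SELECT 1', $mysqlTime, $queryCount);

$total = microtime(true) - $pageStart;
printf("%d queries, %.3fs in MySQL, %.3fs total, %.3fs in PHP\n",
    $queryCount, $mysqlTime, $total, $total - $mysqlTime);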
Actually, 3.x is considerably faster than 2.5x for large sites, because 2.x had several places where it tried to calculate data dynamically, which becomes a big problem with a large database.
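To illustrate that difference in generic terms (this is not WSN's actual schema; every table, column, and id below is made up for the sketch): recounting items on every page view gets slower as the table grows, while reading a stored total that is only updated on submission stays cheap.

<?php
// Generic illustration of dynamic vs. precomputed counts -- not WSN's actual schema.
$db = new mysqli('localhost', 'user', 'password', 'wsnlinks'); // placeholder connection

// Dynamic style: recount the category on every page view (cost grows with the table).
$row = $db->query('SELECT COUNT(*) AS total FROM links WHERE categoryid = 5')->fetch_assoc();
$dynamicCount = (int) $row['total'];

// Precomputed style: read a stored total (one small indexed lookup regardless of size).
$row = $db->query('SELECT numlinks FROM categories WHERE id = 5')->fetch_assoc();
$cachedCount = (int) $row['numlinks'];

// The stored total is bumped once when a link is added, instead of recounted per view.
$db->query('UPDATE categories SET numlinks = numlinks + 1 WHERE id = 5');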
Synozeer wrote: will you ever have access to a high traffic site to test this on
My forums have 9,200 members, 273,000 posts, and about 15 members plus another 15 guests online at any given time. The forum script uses the same code (albeit a newer version with a few extra little files), so things that would affect a large Links site would also affect a large Forums site. The only load issues I've ever had with it are the same ones I had with vBulletin previously (which I used WSN Forum to bring under control, actually).
I'm sure there must be at least dozens of other users with traffic similar to yours (securitydocs, for example, ranks near you in Alexa). So, once again, the missing piece is what makes your site unique.
Anyhow, for anyone with email problems: turn off the queue and there will be no script/MySQL involvement; it will rest squarely on how your server's mail function performs (a bare test of that is sketched below).
This article would also be applicable to 3.2, but should not be necessary.
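As a reference point, the sketch below tests the server's mail function with no WSN Links involvement at all; the address is a placeholder, and if even this produces duplicates, the problem is in the mail setup rather than the script.

<?php
// Minimal test of the server's mail() function, outside WSN Links entirely.
// The address is a placeholder; use one whose inbox you can watch.
$sent = mail(
    'you@example.com',
    'mail() test ' . date('H:i:s'),   // timestamp makes any duplicate easy to spot
    'Single test message sent directly through PHP mail().'
);
echo $sent ? "mail() accepted the message\n" : "mail() reported a failure\n";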
Synozeer
Forum Regular
Usergroup: Customer
Joined: Jun 02, 2004
Total Topics: 32
Total Comments: 142
Not necessarily. My site has anywhere from 50 to 150 users on at any one time. Having active users on the site is very different from just having a lot of posts/links created, although those contribute as well. The more users on the site, the more load and queries are created.
I turned off the queue and I'm still getting duplicate emails. Other sites on my server send out emails and none of them result in duplicates. This only happens with the site running WSN Links.
Actually, securitydocs posted in this very thread that he has many of the same performance problems that I do. See mrowton's post from Jun 3 on page 2.
Adam