Comments on feed with expiring links
Forum Regular
Usergroup: Customer
Joined: Apr 03, 2007
Location: NY & PA
Total Topics: 94
Total Comments: 339
Is there a way for links which have expired to be removed from a feed, or is that something that would have to be controlled by the site offering the feed?
For example, on the site listed above there is a list of sales from Tiger Direct. If you click on each link it will take you to the page for that sale, but if the sale has expired it will take you to their home page instead, so I'd like to just remove that link.
If it's something that would have to be controlled by Tiger Direct in this example, then is there a way I can have the link deleted/hidden once the landing page redirects to a certain URL?
developer
Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California
Total Topics: 61
Total Comments: 7868
By "expired" you're not talking about the WSN expiration feature?
I suppose I could build a new link checker option to find pages that are redirecting, but then you won't want to delete most redirects, you'll have to run it manually, and it'll take ~5 hours / $250. Adding a per-link option to specify which URL counts as a "bad redirect" for that link would add another 2 hours.
Forum Regular
Usergroup: Customer
Joined: Apr 03, 2007
Location: NY & PA
Total Topics: 94
Total Comments: 339
Yes, you're correct, I didn't mean WSN expiration. But let me ask you this: is there a way to have links that are pulled in from a feed automatically delete after a period of time? So any links added to the directory today would expire (delete) 2 weeks from today, but any links added to the directory next week would expire 2 weeks from that date. Is something like what I'm describing feasible?
developer
Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California
Total Topics: 61
Total Comments: 7868
I believe this, as a cron or modification file, would do it:
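Presumably something along these lines; this is a reconstruction based on the corrected line at the end of the thread (the same delete call, just without the parentheses around the time arithmetic), with $db assumed to be WSN's database object available in a cron or modification file. Heed the warning in the next reply before running anything like it.

// Delete feed-imported links (xmlsource > 0) that are more than two weeks old.
// Reconstructed as originally posted, i.e. WITHOUT the parentheses added later in the thread.
$db->delete('linkstable', 'xmlsource > 0 AND time < '. time() - 86400*14);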
Forum Regular
Usergroup: Customer
Joined: Apr 03, 2007
Location: NY & PA
Total Topics: 94
Total Comments: 339
WARNING - DO NOT RUN THE ABOVE CODE WITHOUT A BACKUP FIRST!
Thanks for trying, Paul, but when I ran the above as a cron it deleted all the links in my directory. Thank goodness I had a backup that was a few days old, so it wasn't a tragedy, but I want to make sure no one else runs it without a backup on hand.
developer
Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California
Total Topics: 61
Total Comments: 7868
If it deleted everything, that means all your links are from feeds and more than 2 weeks old...
Forum Regular
Usergroup: Customer
Joined: Apr 03, 2007
Location: NY & PA
Total Topics: 94
Total Comments: 339
Actually, that is not the case. The only links from a feed are the ones in one subcategory, although most links may be over 2 weeks old; I don't remember offhand. But my wsn_links table was completely cleaned out after running the cron.
In any case, I'm not pursuing this anymore, so don't worry about it. I just wanted to make sure no one happens to come across the code and get burned.
developer
Usergroup: Administrator
Joined: Dec 20, 2001
Location: Diamond Springs, California
Total Topics: 61
Total Comments: 7868
Very odd. For no apparent reason, you have to use parentheses:
$db->delete('linkstable', 'xmlsource > 0 AND time < '. (time() - 86400*14));
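The reason, most likely: in PHP versions before 8.0 the concatenation operator . shares precedence with + and - and associates left to right, so without the parentheses the timestamp is concatenated onto the string first and the subtraction is then applied to that whole string, which casts to 0. Assuming the WHERE string is passed into the query as-is, the condition collapses to a bare non-zero number, which matches every row, hence the emptied table. A sketch of the two readings (the $cutoff variable is just an illustration, not part of WSN):

// Pre-PHP 8, '.' and '-' bind equally and left to right, so the unparenthesized version parses as
//   ('xmlsource > 0 AND time < ' . time()) - 86400*14
//   => 0 - 1209600   (the string has no leading digits, so it casts to 0)
//   => -1209600, a bare non-zero WHERE condition that deletes everything.
// Forcing the arithmetic first, with parentheses or a separate variable, builds the intended clause:
$cutoff = time() - 86400*14; // unix timestamp for two weeks ago
$db->delete('linkstable', 'xmlsource > 0 AND time < ' . $cutoff);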