About: This addon takes just moments to install; it gathers your forum topic links and exports them as a sitemap that you can submit to Google, Yahoo, or just about any search engine which utilizes sitemaps.
The configuration options allow you to specify which type of sitemap you wish to use and whether you wish to use threaded or flat view. The sitemap type and view type can both be set either hard-coded or via URL string.
Simply enter the URL of your sitemap install into your Google sitemap list and Google will automatically grab copies of your thread links every week.
What are the security aspects?: Well, the sitemap will ignore threads in private forums (if a guest cannot *see* them, then the sitemap will not publish the link). Also, threads that are not approved will not show either (same logic as above).
What is a sitemap?: A sitemap provides all of your current threads to search engines, eliminating the wait for "new discovery" of new pages on your forums.
You can read about the benefits of using sitemaps via the below quoted article from Google Webmaster Tools.
Please note that this addon does not create a sitemap for your entire site, and only creates one for your forum.
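For illustration, a minimal XML sitemap looks like the sketch below (the thread URL and dates are hypothetical, not this addon's exact output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/forum/ubbthreads.php?ubb=showflat&amp;Number=12345</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Each `<url>` entry is one thread link; search engines read the file and queue those URLs for crawling.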
Demo: As this addon doesn't utilize a cache, no active demo is available; however, you can see example output via the attached screenshots.
Install Instructions: The install is quite simple. After receiving the software via email, simply open the PHP file and place your MySQL connection information where indicated. Now upload the file to your web root (not your forum directory), then open the script via the web and you'll see the default XML (Google-valid) sitemap.
Usage Instructions: By default, the script uses flat mode and the XML map. These defaults can be easily changed from within the script itself.
As for web-based switches: http://www.example.com/sitemap.php?type=&view=&se=
Optional parameter -- se=# where # is whether or not to use SE-friendly URLs (0/No, 1/Yes)
Optional parameter -- type=# where # is a map type (1/XML, 2/Text, 3/HTML, 4/RSS, 5/ASP, 6/SitemapIndex)
Optional parameter -- view=# where # is your view type (1/ShowFlat, 2/ShowThreaded)
Optional parameter -- offsets=# where # is whether or not to use offsets.
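As a quick illustration of combining those switches, here is a small Python sketch (not part of the addon; the `sitemap_url` helper and its parameter filtering are assumptions based on the parameter list above) that builds a request URL:

```python
from urllib.parse import urlencode

# Hypothetical helper: builds a sitemap.php request URL from the
# documented switches. Only the parameters listed above (type, view,
# se, offsets) are passed through; anything else is dropped.
def sitemap_url(base="http://www.example.com/sitemap.php", **params):
    allowed = {"type", "view", "se", "offsets"}
    query = [(k, v) for k, v in sorted(params.items()) if k in allowed]
    return base + ("?" + urlencode(query) if query else "")

# An RSS-style map (type=4) in threaded view (view=2) with SE-friendly URLs:
print(sitemap_url(se=1, type=4, view=2))
# -> http://www.example.com/sitemap.php?se=1&type=4&view=2
```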
Submissions: Not all search engines will automagically re-check your sitemap; in fact, I've only noticed that Google will re-check on its own. You'll want to go to Yahoo or anyone else every couple of weeks and tell them to re-grab a copy of your map.
Attachment Information: sitemap-xml.jpg - An XML map. sitemap-rss.jpg - An RSS map. sitemap-text.jpg - A simple "text link list" map. sitemap-link.jpg - A simple "html link list" map. sitemap-asp.jpg - An ASP style sitemap. sitemap-index.jpg - A screen cap of the Sitemap Index feature.
Pricing: $50 per install (each separate install requires a separate license) which includes 1 year of free updates.
How to Buy: You can order via our Script Information page here (there is an order form towards the bottom for PayPal payments).
Notice: This addon is not freeware, and as such you cannot freely provide it to others without prior written permission from its author(s). Permission was obtained from the UBB Dev administration to post this non-free addon's information in advance.
A Sitemap file lets you tell us about all the pages on your site, and optionally, information about those pages, such as which are most important and how often they change. By submitting a Sitemap file, you can take control of the first part of the crawling/indexing processes: our discovery of the pages.
This may be particularly helpful if your site has dynamic content, pages that aren't easily discovered by following links, or if your site is new and has few links to it.
Sitemaps help speed up the discovery of your pages, which is an important first step in crawling and indexing your pages, but there are many other factors that influence the crawling/indexing processes. Sitemaps lets you tell us information about your pages (which ones you think are most important, how often the pages change), so you can have a voice in these subsequent steps. Other factors include how many sites link to you, if your content is unique and relevant, if we can crawl the pages successfully, and everything outlined in our webmaster guidelines.
A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. Sites are never penalized for using this service. This is a beta program, so we cannot make any predictions or guarantees about when or if your URLs will be crawled or added to our index. Over time, we expect both coverage and time-to-index to improve as we refine our processes and better understand webmasters' needs.
Once you have created your Sitemap, you can add it to your Google Sitemaps account and update it as your site changes.
Licensed Sites: 33 (not including my own or ubbdev)
Those are kind of out of scope, I think; it's designed to just pull the threads themselves, not forum topic indexes (which will likely be in a future build).
I'd have to see how you're calling these extra pages (header/footer) to be able to give a for sure answer.
I'm pretty sure that I could make an index page which could take all files in a directory and build a sitemap for you. But keep in mind that if it's just a custom header and footer, Google won't really care; you're not feeding your content to them, you're simply feeding them the URLs of your threads. The feed does not include any post content (as they don't want it).
Another product I'll be coming out with soon is a simple sitemap lister into which you can manually enter URLs from your site; but I'm working on a couple of other projects until I can get to that.
I've been toying with cURL for building automagic sitemaps. I have something that works for the most part, but you have to consider that PHP has a script execution time limit, which would make it die prematurely on large sites (or forums, for example).
Since I moved from 6.5 to 7.X, Google has all but stopped crawling my threads/topics, and as a result the number of new members I get has dropped dramatically! I hope this will help (should I decide to buy it)!
I have recently moved my forums to a new domain, so it was important to me to get my site back into Google under its new domain name as soon as I could.
Gizmo has been raving about the sitemap program he made, so I thought to myself (he seems like a nice chap), I think I will give it a go and see if it really does work.
The answer is YES
21 days after I installed the sitemap, Google has listed my site under its new domain name with 6,000+ links. Not all my forums are public.
It is fair to ask what would have happened if I didn't have the sitemap installed. Well, I can't answer that, to be honest, as I'm now listed. I like to think that it has helped in my situation of a domain move for my forums.
The sitemap has worked for me, but if you have any questions about its functionality then please contact Gizmo via PM, as I can only report back the results I have.
Gizmo's sitemap isn't free, but for my needs it was worth the money. You will also need a Google Webmasters account. It also works with Yahoo.
v0.3 has been completed this morning. I've done a lot of code cleanup, from both a security and a code standpoint. Instead of "assuming" data passed to the script from the database is "valid" for feeds, we run subjects through htmlentities (thus sanitizing the output) and we force "int" on fields which should only contain numbers.
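The same two-pronged sanitization idea can be sketched in Python (the addon itself is PHP and uses htmlentities plus integer casts; the code below is only an illustration of the concept, not the addon's actual code):

```python
import html

# Escape thread subjects so markup characters can't break the XML/HTML
# feed output (the Python analogue of PHP's htmlentities).
def sanitize_subject(subject: str) -> str:
    return html.escape(subject, quote=True)

# Force numeric fields (thread numbers, forum IDs) to integers; a
# non-numeric value raises instead of passing through to the feed.
def sanitize_id(value) -> int:
    return int(value)

print(sanitize_subject("Tips & <tricks>"))  # Tips &amp; &lt;tricks&gt;
print(sanitize_id("42"))                    # 42
```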
I've repaired a problem with the offsets option introduced in 0.2, which allows you to split your one large sitemap into several easier-to-manage sitemaps (thus allowing you the opportunity to not exceed the 50k link limit that some services enforce).
I have also built onto the offsets option the ability to utilize a sitemap index, which will list any "valid" sitemaps from your forum (built using this script only). Everything is done automatically; all you need to do is submit the sitemap index to Google (or any search engine which supports them), and when you tell them to look at your site, they will grab your list of maps and work through them accordingly.
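For reference, a sitemap index is just a small XML file pointing at the individual maps; a sketch is below (the query-string values used to select each offset here are assumptions, not the script's documented syntax):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap.php?type=1&amp;offsets=1</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap.php?type=1&amp;offsets=2</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index URL; the search engine follows each `<loc>` to fetch the individual sitemaps.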
I have also set the "offsets" option to off by default; when I introduced it in 0.2 I set it to "on" by default.
Coming in 0.4: I'm looking to add the ability to assign "importance" to URLs based on their posting date; this will allow newer data to have a higher importance than older data, thus allowing it to be crawled more often than older threads.
I'm also looking to allow the maximum thread inclusion to be adjusted in the URL as well; however, this ability will not be able to exceed the script's hard-coded 50k limit.
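Since 0.4 isn't out yet, here is only a sketch of one way the planned date-based importance could work (the tiers and values below are purely assumed for illustration):

```python
from datetime import datetime, timedelta

# Assumed example weighting: newer threads get a higher sitemap
# <priority> value, decaying toward 0.1 as the thread ages.
def priority_for(posted: datetime, now: datetime) -> float:
    age_days = (now - posted).days
    if age_days <= 7:
        return 1.0   # posted this week: crawl often
    if age_days <= 30:
        return 0.8   # posted this month
    if age_days <= 365:
        return 0.5   # posted this year
    return 0.1       # older archive material

now = datetime(2007, 1, 1)
print(priority_for(now - timedelta(days=3), now))    # 1.0
print(priority_for(now - timedelta(days=400), now))  # 0.1
```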
Also, if you've got a suggestion for expansion, please let me know, I'm always looking at new ideas.
If you're looking to upgrade, please email me (james[at]virtualnightclub[dot]net) with your licensed URLs and I'll send off the new files.
If you're looking to purchase a license, please feel more than free to email me for more information.
Custom pages are out of scope for this (at least at this point). I COULD code it in, but cost vs. demand... There would have to be a way to add items into the map, and since the current manner just auto-pulls data from the database, there is no real "need" for the ability to add items (short of posting more content to your forums).
I do, however, have plans to write a separate sitemap utility for custom/standard pages. With this script, since I have to jump through hoops to cater to limitations (file size and total links), it would be difficult to factor in custom pages; the next product will be geared just towards your "standard site" items.
You could always try adding them to the sitemap manually (by hacking it in) so long as you're sure you won't bust over the 50k links/page limitation.
Q. Can I 'select' which forums I want to include on the sitemap? A. No, it takes any "public" forum and creates a map for posts in it.
There is a valid reason for this: Google will eventually crawl all the data either way, and a sitemap just helps it along; the data will be there sooner or later.
Q. What is your upgrade policy for your program versions and for UBB versions? A. Upgrades, at this point, are free of charge to valid license holders; I upgrade as the UBB upgrades (for example, as the page variable became redundant, it was purged in 0.3). The upgrade pricing may change in the future, but at this point I have no plans of charging for upgrades. A. Upgrades are free for 1 year (365 days), after which you can purchase upgrades at $25/yr.
To receive an upgrade to the UBB.Sitemapper, all you need to do is email me after one is available and I'll email it back; however, I do request that when emailing me you also include the site that you've purchased it for, so I can do internal auditing (for a possible download manager in the future).
I've been putting some finishing touches on 0.5; the current changelog is:
-- 0.5 --
Removed "Priority" tag as it was optional and producing duplicate errors from Google.
Removed some "test code" from the header of the script left in from 0.4.
Fixed a bugglet with the "read from config" option which was added in 0.4.
Added a "tick" to config which allows you to display the optional "changefreq" tag.
Added the ability to pull the community name from your UBB config as the feed name.
Updated the changelog format to fix spelling and grammatical errors introduced in 0.3.
If you've noticed an inconvenience or would like to see a change (or update) please let me know so I can get it in! Feel free to email me as well (james[at]virtualnightclub[dot]net)
Use the sitemap index option; it'll create an index of however many sitemaps are needed, each map sized according to the parameter in the sitemap script...
EX: You have 100,000 threads and set 25k links per sitemap, so the script would push out an index with 4 sitemaps, each covering an offset of 25k: the first would be 1-25,000, the second 25,001-50,000, etc...
I'd recommend using no more than say 25k chunks however, makes the query a little easier to deal with lol...
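The offset math above can be sketched in a few lines of Python (an illustration of the chunking idea, not the addon's actual code; the `chunk_ranges` helper is made up for this example):

```python
# Split a forum's thread count into sitemap-sized chunks, the way the
# sitemap index option does: e.g. 100,000 threads at 25,000 links per
# map yields 4 sitemaps.
def chunk_ranges(total_threads: int, per_map: int):
    """Return (start, end) thread ranges, 1-based inclusive."""
    ranges = []
    start = 1
    while start <= total_threads:
        end = min(start + per_map - 1, total_threads)
        ranges.append((start, end))
        start = end + 1
    return ranges

print(chunk_ranges(100_000, 25_000))
# [(1, 25000), (25001, 50000), (50001, 75000), (75001, 100000)]
```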
Q1 - I've just moved my site to a new host, and it's now under a new domain, with the old domains redirecting to it (for now at least). However, the new domain has 2 variants (a DOT com and a DOT co DOT uk); the latter just points back to the former. Do I need 2 licenses or 1?
1. Just install it for the primary; you really should have 1 domain with the others redirecting to it for better results in engines (duplicate anything brings one down in points).
2. Yes, the script itself is completely separate from the UBB; as it only relies on standard data in the db, it should be compatible with just about any install within v7.
Sweet. Point taken on the redirects - I've just moved a load of stuff over from another hosting company and registrar, so things are a little disorganized at the moment. Ultimately everything will be redirected to a single point.