About: This addon takes just moments to install; it gathers your forum topic links and exports them as a sitemap that you can add to Google, Yahoo, or just about any search engine which utilizes sitemaps.
The configuration options allow you to specify which type of sitemap you wish to use and whether you wish to output threaded or flat view. The sitemap type and view type can both be set either hard-coded or via URL string.
Simply enter the URL of your sitemap install into your Google sitemap list (here) and Google will automatically grab copies of your thread links every week.
What are the security aspects?: Well, the sitemap will ignore threads in private forums (if a guest cannot *see* them, then the sitemap will not publish the link). Also, threads that are not approved will not show either (same logic as above).
What is a sitemap?: A sitemap provides all of your current threads to search engines, thus eliminating the wait for "new discovery" of new pages on your forums.
You can read about the benefits of using sitemaps via the below quoted article from Google Webmaster Tools.
Please note that this addon does not create a sitemap for your entire site, and only creates one for your forum.
Demo: As this addon doesn't utilize a cache, no active demo is available; however, you can see example output via the attached screenshots.
Install Instructions: The install is quite simple: after receiving the software via email, open the PHP file and place your MySQL connection information where indicated. Then upload the file to your web root (not your forum directory) and open the script via the web; you'll see the default XML (Google-valid) sitemap.
Usage Instructions: By default, the script uses flat mode and the XML map. These defaults can be easily changed from within the script itself.
As for web-based switches: http://www.yoursite.tld/sitemap.php?type=&view=&se=
// Optional Parameter -- se=# where # is whether or not to use SE Friendly URLs (0/No, 1/Yes)
// Optional Parameter -- type=# where # is a map type (1/XML, 2/Text, 3/HTML, 4/RSS, 5/ASP, 6/SitemapIndex)
// Optional Parameter -- view=# where # is your view type (1/ShowFlat, 2/ShowThreaded)
// Optional Parameter -- offsets=# where # is whether or not to use offsets.
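Putting those switches together, a full request URL can be assembled like so (a minimal Python sketch for illustration; the hostname and parameter values are placeholders, not part of the addon itself):

```python
from urllib.parse import urlencode

# Hypothetical values chosen from the switch list above.
params = {
    "type": 1,   # 1 = XML map
    "view": 1,   # 1 = ShowFlat
    "se": 1,     # 1 = use SE Friendly URLs
}

url = "http://www.yoursite.tld/sitemap.php?" + urlencode(params)
print(url)  # http://www.yoursite.tld/sitemap.php?type=1&view=1&se=1
```

Leaving a parameter out of the URL simply falls back to the defaults set inside the script.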
Submissions: Not all search engines will automagically re-check your sitemap; in fact, I've only noticed that Google will re-check on its own. You'll want to go to Yahoo or anyone else every couple of weeks and tell them to re-grab a copy of your map.
Version Additions: v0.10 (October 26th, 2015)
• All queries have been cleaned up; the tally that determines a SitemapIndex's pages is done entirely server side as a single database query. Additionally, the query options were combined and rewritten so that one query loads all of the needed data for building each Sitemap.
• Several configuration options were changed in both v0.9 and v0.10; it is recommended that you compare the options at the top of your current build against the newest version you wish to upgrade to, so you can see the changes in options and defaults.
• We now recommend everyone run as a SitemapIndex with offsets enabled, as it future-proofs your site against the amount of data in your Sitemap and breaks the data up into manageable chunks, which should reduce server strain on large forums.
• Added support for date_default_timezone_set; you can now set the timezone for the generation times supplied by your sitemap (defaults to Pacific Standard Time for those who do not set an option).
• Support for virtual SSL over a CDN using "HTTP_X_FORWARDED_PROTO".
• Fixed a bug with caching.
• Fixed a time issue with the formatting previously used.
• Removed the recommendation of blocking user profiles in the robots.txt file, as user profiles can contain potentially valuable information as far as data is concerned.
v0.9 (April 02, 2015)
All configuration items are now stored in a $conf array.
ASP sitemaps no longer populate both the title and description tags, as they were duplicated and thus added to the bulk of sitemap pages.
General script code cleanup, reformatting, and restructuring.
The test string is now type 0; previously, type 0 was not utilized and would cause errors if indicated. Type 7 no longer works for any type of generation.
We're now utilizing more current sitemap standards; previously we were utilizing earlier versions.
There is now support for v7.6.0-style SEO URLs.
The site name parameter is no longer passed to the thread title as this just added a lot of extra bulk to the pages.
The last reply time of a topic is now utilized in XML sitemaps.
Utilization of an XML Stylesheet is now supported for type 1; see the supplied css/sitemap.css file for basic options.
For compatibility for users hosting their UBB.threads install in the web root, the $conf["folder"] variable now requires a trailing slash.
RSS Sitemaps now utilize the tag which is fed from the $conf["cache"]["time"] variable.
Code Cleanup (no more Undefined variables)
More efficient handling of SSL detection
Preview no longer allows cache (as it should always be "live" for testing)
SE Friendly urls now properly match the UBB's SE Friendly urls (including a more powerful SEO score for URL structure).
You can now set the frequency that you want Search Engines to check back for new content using the $use_frequency tick.
v0.6 - This is known as the Caching and Optimization update, since improvements were added to increase the speed of your install.
Added a simple time-based caching system which will allow you to set how long the output is to be cached; this cache is built on the first visit to the script of each url type (if a value is changed in the requesting url it'll result in a separate caching of that output of data). The default caching time is 24 hours (this value is entered in seconds).
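The addon itself is PHP, but the idea of a simple time-based cache described above can be sketched like this (Python for illustration; the cache directory name and the 24-hour default are assumptions mirroring the description, not the addon's actual code):

```python
import os
import time

CACHE_DIR = "cache"        # assumed cache location
CACHE_TTL = 24 * 60 * 60   # default: 24 hours, entered in seconds

def cached_output(key: str, build):
    """Return cached text for `key` if still fresh, else rebuild and store it.
    Each distinct URL-parameter combination gets its own cache file."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, key + ".cache")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < CACHE_TTL:
        with open(path) as f:
            return f.read()
    data = build()  # expensive generation (database queries, XML building)
    with open(path, "w") as f:
        f.write(data)
    return data
```

Changing `type` or `view` in the requesting URL would map to a different cache key, producing a separate cached copy of that output, as the paragraph above describes.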
Added an Expires header to the code to allow client browsers (or search engines, if they support it) to cache the output of the sitemap script on their local machine for a defined period of time (default 24 hours, entered in seconds).
Added the ability to serve the output of the script via GZip to aid in download times (basically serving a compressed dataset to the user). GZip has been known to cause some minor CPU load, but most virtual host providers should be more than capable of serving content.
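The Expires header and GZip serving described above amount to logic along these lines (a hedged Python sketch of the equivalent behavior; the PHP addon's actual option names are not shown):

```python
import gzip
import time
from email.utils import formatdate

def compress_and_headers(body: str, max_age: int = 86400):
    """Gzip the sitemap output and build the caching headers described above.
    The 24-hour (86400 second) default mirrors the addon's stated default."""
    payload = gzip.compress(body.encode("utf-8"))
    headers = {
        "Content-Encoding": "gzip",
        "Expires": formatdate(time.time() + max_age, usegmt=True),
        "Content-Length": str(len(payload)),
    }
    return payload, headers
```

Serving the compressed payload trades a little CPU time for smaller transfers, which is the trade-off the paragraph above notes.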
Note that some of these items may not work in your environment, hence why we added several options. You can use all of them, or none of them, or any portion of them that you wish; they're capable of running independently or in conjunction with each other.
Fixed numerous cosmetic display issues and fixed a bug that had been present since ~v0.2 which made it so the "blank" display didn't parse any data (instead of falling back to the display defaults).
v0.5 - Removed "Priority" tag as it was optional and providing duplicate errors from Google.
Removed some "test code" from the header of the script left in from 0.4.
Fixed a bugglet with the "read from config" option which was added in 0.4.
Added a "tick" to config which allows you to display the optional "changefreq" tag.
Added the ability to pull the community name from your ubb config as the feed name.
Updated the changelog format to fix spelling and grammatical errors introduced in 0.3.
The db connection data no longer has to be hand written into the ubb sitemap script; so long as you set the path to your UBB Configuration directory and have "connection" set to 1, it will read the connection data from your UBB configuration file.
Additionally the Preview URL set in the "usage" area of this readme has been updated to reflect the 0.3a update. Also included in this update is compatibility with the permission update for 7.3.
v0.3 - Support for Sitemap Indexes has been built in; this works well with the "offset" feature added in v0.2. The sitemap index will look at the XML sitemap data and build the proper number of "valid" links to offset pages. This is extremely useful for larger forums, as they can set the level at which they want to offset their data (so as not to go over the 50k maximum link limit with most providers) and they need only submit one URL (versus numerous URLs to cover additional sitemap pages).
v0.2 - This version introduces offsetting; Google allows a maximum of 50k links in a sitemap, and some larger (huge) forums have more than 50k threads. This setting is enabled by default and can be utilized by using the offset variable in the URLs you submit to Google/Yahoo. It can also be disabled completely in the configuration file.
Attachment Information: sitemap-xml.jpg - An XML map. sitemap-rss.jpg - An RSS map. sitemap-text.jpg - A simple "text link list" map. sitemap-link.jpg - A simple "html link list" map. sitemap-asp.jpg - An ASP style sitemap. sitemap-index.jpg - A screen cap of the Sitemap Index feature.
Pricing: $50 per install (each separate install requires a separate license) which includes 1 year of free updates ($40/yr updates thereafter).
How to Buy: You can order via our Script Information page here (there is an order form towards the bottom for PayPal payments).
Notice: This addon is not freeware, and as such you cannot freely provide it to others without prior written permission from its author(s). Permission was obtained from the UBB Dev administration to post this non-free addon's information in advance.
A Sitemap file lets you tell us about all the pages on your site, and optionally, information about those pages, such as which are most important and how often they change. By submitting a Sitemap file, you can take control of the first part of the crawling/indexing processes: our discovery of the pages.
This may be particularly helpful if your site has dynamic content, pages that aren't easily discovered by following links, or if your site is new and has few links to it.
Sitemaps help speed up the discovery of your pages, which is an important first step in crawling and indexing your pages, but there are many other factors that influence the crawling/indexing processes. Sitemaps lets you tell us information about your pages (which ones you think are most important, how often the pages change), so you can have a voice in these subsequent steps. Other factors include how many sites link to you, if your content is unique and relevant, if we can crawl the pages successfully, and everything outlined in our webmaster guidelines.
A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. Sites are never penalized for using this service. This is a beta program, so we cannot make any predictions or guarantees about when or if your URLs will be crawled or added to our index. Over time, we expect both coverage and time-to-index to improve as we refine our processes and better understand webmasters' needs.
Once you have created your Sitemap, you can add it to your Google Sitemaps account and update it as your site changes.
Licensed Sites: 16 (not including my own or ubbdev)
Those are kind of out of scope, I think; it's designed to just pull the threads themselves, not forum topic indexes (which will likely be in a future build).
I'd have to see how you're calling these extra pages (header/footer) to be able to give a for sure answer.
I'm pretty sure that I could make an index page which could take all files in a directory and build a sitemap for you; but keep in mind that if it's just a custom header and footer, Google won't really care. You're not feeding your content to them, you're simply feeding the URLs of your threads; the feed does not include any post content (as they don't want it).
Another product I'll be coming out with soon is a simple sitemap lister where you can manually enter URLs on your site; but I'm working on a couple of other projects until I can get to that.
I've been toying with cURL for building automagic sitemaps; I have something that works for the most part, but you have to consider that PHP has a script execution time limit which would make it die prematurely on large sites (or forums, for example).
Since I moved from 6.5 to 7.X, Google has all but stopped crawling my threads/topics and as a result the number of new members I get has dropped dramatically! I hope this will help (should I decide to buy it)!
I have recently moved my forums to a new domain, so it was important to me to get my site under its new domain name back into Google as soon as I could.
Gizmo has been raving about the sitemap program he made, so I thought to myself (he seems like a nice chap), I think I will give it a go and see if it really does work.
The answer is YES
21 days after I installed the sitemap, Google had listed my site under its new domain name with 6,000+ links. Not all my forums are public.
It is fair to ask what would have happened if I didn't have the sitemap installed. Well, I can't answer that honestly, as I'm now listed; I like to think that it has helped in my situation, where I moved my forums to a new domain.
The sitemap has worked for me, but if you have any questions about its functionality then please contact Gizmo via PM, as I can only report back the results I have.
Gizmo's sitemap isn't free, but for my needs it was worth the money. You will also need a Google Webmasters account. It also works with Yahoo.
v0.3 has been completed this morning. I've done a lot of code cleanup, from both a security and a code standpoint. Instead of "assuming" data passed to the script from the database is "valid" for feeds, we run subjects through htmlentities (thus validating the output) and we force "int" on fields which should only contain numbers.
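The same sanitation idea, sketched in Python for illustration (the addon itself is PHP; `html.escape` roughly corresponds to `htmlentities`, and the int cast guards numeric fields — the function and sample values below are hypothetical):

```python
from html import escape

def sanitize_row(subject, thread_id):
    """Escape the thread subject for safe feed output and force the id
    to an integer, mirroring the htmlentities + int-cast approach above."""
    return escape(str(subject)), int(thread_id)

print(sanitize_row("Tips & <tricks>", "42"))  # ('Tips &amp; &lt;tricks&gt;', 42)
```

Escaping prevents stray markup in thread titles from breaking the XML feed, and the int cast rejects non-numeric ids outright.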
I've repaired a problem with the offsets option introduced in 0.2, which allows you to split your one large sitemap into several easier-to-manage sitemaps (thus allowing you the opportunity to not go over the 50k link limit that some services enforce).
I have also built onto the offsets option the ability to utilize a sitemap index, which will list any "valid" sitemaps from your forum (built using this script only); everything is done automatically. All you need to do is submit the sitemap index to Google (or a search engine which supports them), and when you tell them to look at your site, they will grab your list of maps and work on them accordingly.
I have also set the "offsets" option to off by default; when I introduced it in 0.2 I set it to "on" by default.
Coming in 0.4: I'm looking to add the ability to assign "importance" to URLs based on their date posted; this will allow newer data to have a larger importance than older data, thus allowing it to be crawled more often than older threads.
I'm also looking to allow the maximum thread inclusion to be adjusted in the URL as well; however, this ability will not be able to go over the script's hardcoded 50k limit.
Also, if you've got a suggestion for expansion, please let me know, I'm always looking at new ideas.
If you're looking to upgrade, please email me (james[at]virtualnightclub[dot]net) with your licensed url's and I'll send off the new files.
If you're looking to purchase license please feel more than free to email me for more information.
Custom pages are out of scope for this (at least at this point). I COULD code it in, but cost vs demand... There would have to be a way to add items into the map, and since the current manner just auto-pulls data from the database, there is no real "need" for the ability to add items (short of posting more content to your forums).
I do, however, have plans to write a separate sitemap utility for custom/standard pages; since this script has to jump through hoops to cater to limitations (file size and total links), it would be difficult to factor in custom pages. The next product will be geared just towards your "standard site" items.
You could always try adding them to the sitemap manually (by hacking it in) so long as you're sure you won't bust over the 50k links/page limitation.
Q. Can I 'select' which forums I want to include on the sitemap? A. No, it takes any "public" forum and creates a map for posts in it.
There is a valid reason for this: Google will eventually crawl all data either way, and a sitemap just helps it along; the data will be there (sooner or later).
Q. What is your upgrade policy for your program versions and for UBB versions? A. Upgrades, at this point, are free of charge to valid license holders; I upgrade as the UBB upgrades (for example, as the page variable was redundant, it was also purged in 0.3). The upgrade pricing may change in the future, but at this point I have no plans of charging for upgrades. A. Upgrades are free for 1 year (365 days), after which you can purchase upgrades at $25/yr.
To receive an upgrade to the UBB.Sitemapper, all you need to do is email me after one is available and I'll email it back; however, I do request that when emailing me you also include the site that you've purchased it for, so I can do internal auditing (for a possible download manager in the future).
I've been putting some finishing touches on 0.5; the current changelog is:
-- 0.5 --
Removed "Priority" tag as it was optional and providing duplicate errors from Google.
Removed some "test code" from the header of the script left in from 0.4.
Fixed a bugglet with the "read from config" option which was added in 0.4.
Added a "tick" to config which allows you to display the optional "changefreq" tag.
Added the ability to pull the community name from your ubb config as the feed name.
Updated the changelog format to fix spelling and grammatical errors introduced in 0.3.
If you've noticed an inconvenience or would like to see a change (or update) please let me know so I can get it in! Feel free to email me as well (james[at]virtualnightclub[dot]net)
Use the sitemap index option; it'll create an index of X sitemaps, each map X size based on the parameter in the sitemap script...
EX: You have 100,000 threads and you set 25k links per sitemap, so the script would push out an index with 4 sitemaps; each sitemap would push an offset of 25k, so the first would cover 1-25,000, the second 25,001-50,000, etc...
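The arithmetic in that example can be sketched as a quick helper (Python for illustration only; the addon computes this in PHP, and the function name here is made up):

```python
import math

def sitemap_index_pages(total_threads: int, per_map: int):
    """Return how many sitemaps the index needs and the thread range
    each one covers, given a per-sitemap link cap."""
    pages = math.ceil(total_threads / per_map)
    ranges = [(i * per_map + 1, min((i + 1) * per_map, total_threads))
              for i in range(pages)]
    return pages, ranges

pages, ranges = sitemap_index_pages(100_000, 25_000)
print(pages)   # 4
print(ranges)  # [(1, 25000), (25001, 50000), (50001, 75000), (75001, 100000)]
```

With an uneven total (say 90,000 threads at 25k per map), the ceiling division still yields 4 maps, with the last one simply shorter than the rest.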
I'd recommend using no more than say 25k chunks however, makes the query a little easier to deal with lol...
Q1 - I've just moved my site to a new host, and it's now under a new domain with the old domains redirecting to it (for now at least). However, the new domain has 2 variants (a DOT com and a DOT co DOT uk); the latter just points back to the former. Do I need 2 licenses or 1?
1. Just install it for the primary; you really should have 1 domain with the others redirecting to it for better results in engines (duplicating anything brings one down in points).
2. Yes, the script itself is completely separate from the UBB, as it only relies on standard data in the db it should be compatible with just about any install within v7.
Sweet! Point taken on the redirects; I've just moved a load of stuff over from another hosting company and registrar, so things are a little disorganized at the moment. Ultimately everything will be redirected to a single point.