About: This addon takes just moments to install. It gathers your forum topic links and exports them as a sitemap that you can submit to Google, Yahoo, or just about any search engine that utilizes sitemaps.
The configuration options allow you to specify which type of sitemap you wish to use and whether you wish to output threaded or flat view links. The sitemap type and view type can both be hard-coded or set via URL string.
Simply enter the URL of your sitemap install into your Google sitemap list (here) and Google will automatically grab copies of your thread links every week.
What are the security aspects?: The sitemap will ignore threads in private forums (if a guest cannot *see* them, then the sitemap will not publish the link). Also, threads that are not approved will not show either (for the same reason).
What is a sitemap?: A sitemap provides all of your current threads to search engines, eliminating the wait for "new discovery" of new pages on your forums.
You can read about the benefits of using sitemaps via the below quoted article from Google Webmaster Tools.
Please note that this addon does not create a sitemap for your entire site, and only creates one for your forum.
Demo: As this addon doesn't utilize a cache, no active demo is available; however, you can see example output in the attached screenshots.
Install Instructions: The install is quite simple. After receiving the software via email, open the PHP file and place your MySQL connection information where indicated. Then upload the file to your web root (not your forum directory), open the script via the web, and you'll see the default XML (Google-valid) sitemap.
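For reference, a "Google-valid" XML sitemap follows the standard sitemaps.org format; the output should look roughly like the fragment below (the URL and dates here are placeholders, not actual script output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/ubbthreads.php?ubb=showflat&amp;Number=12345</loc>
    <lastmod>2018-08-28</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```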
Usage Instructions: By default, the script uses flat mode and the XML map. These defaults can be easily changed from within the script itself.
As for web-based switches: http://www.example.com/sitemap.php?type=&view=&se=
// Optional parameter -- se=# where # is whether or not to use SE-friendly URLs (0/No, 1/Yes)
// Optional parameter -- type=# where # is a map type (1/XML, 2/Text, 3/HTML, 4/RSS, 5/ASP, 6/SitemapIndex)
// Optional parameter -- view=# where # is your view type (1/ShowFlat, 2/ShowThreaded)
// Optional parameter -- offsets=# where # is whether or not to use offsets.
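A minimal sketch of how switches like these might be read and validated (Python used purely for illustration; the actual addon is a PHP script, and only the parameter names and value ranges come from the documentation above):

```python
# Illustrative defaults mirroring the documented behavior:
# XML map (type=1), flat view (view=1), SE-friendly URLs off, offsets off.
DEFAULTS = {"type": 1, "view": 1, "se": 0, "offsets": 0}

# Allowed values per switch, per the list of switches above.
ALLOWED = {
    "type": {1, 2, 3, 4, 5, 6},   # XML, Text, HTML, RSS, ASP, SitemapIndex
    "view": {1, 2},               # ShowFlat, ShowThreaded
    "se": {0, 1},                 # SE-friendly URLs: no/yes
    "offsets": {0, 1},            # split output into chunks: no/yes
}

def parse_switches(query: dict) -> dict:
    """Merge URL query parameters over the hard-coded defaults,
    falling back to the default for missing or invalid values."""
    settings = dict(DEFAULTS)
    for key, allowed in ALLOWED.items():
        try:
            value = int(query.get(key, ""))
        except ValueError:
            continue  # missing or non-numeric: keep the hard-coded default
        if value in allowed:
            settings[key] = value
    return settings
```

For example, `parse_switches({"type": "4", "view": "2"})` would select the RSS map in threaded view while keeping the other defaults.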
Submissions: Not all search engines will automagically re-check your sitemap; in fact, I've only noticed that Google will re-check on its own. You'll want to go to Yahoo or anyone else every couple of weeks and tell them to re-grab a copy of your map.
Attachment Information:
sitemap-xml.jpg - An XML map.
sitemap-rss.jpg - An RSS map.
sitemap-text.jpg - A simple "text link list" map.
sitemap-link.jpg - A simple "HTML link list" map.
sitemap-asp.jpg - An ASP-style sitemap.
sitemap-index.jpg - A screen cap of the Sitemap Index feature.
Pricing: $50 per install (each separate install requires a separate license) which includes 1 year of free updates.
How to Buy: You can order via our Script Information page here (there is an order form towards the bottom for PayPal payments).
Notice: This addon is not freeware, and as such you cannot freely provide it to others without prior written permission from its author(s). Permission was obtained in advance from the UBB Dev administration to post this non-free addon's information.
A Sitemap file lets you tell us about all the pages on your site, and optionally, information about those pages, such as which are most important and how often they change. By submitting a Sitemap file, you can take control of the first part of the crawling/indexing processes: our discovery of the pages.
This may be particularly helpful if your site has dynamic content, pages that aren't easily discovered by following links, or if your site is new and has few links to it.
Sitemaps help speed up the discovery of your pages, which is an important first step in crawling and indexing your pages, but there are many other factors that influence the crawling/indexing processes. Sitemaps lets you tell us information about your pages (which ones you think are most important, how often the pages change), so you can have a voice in these subsequent steps. Other factors include how many sites link to you, if your content is unique and relevant, if we can crawl the pages successfully, and everything outlined in our webmaster guidelines.
A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. Sites are never penalized for using this service. This is a beta program, so we cannot make any predictions or guarantees about when or if your URLs will be crawled or added to our index. Over time, we expect both coverage and time-to-index to improve as we refine our processes and better understand webmasters' needs.
Once you have created your Sitemap, you can add it to your Google Sitemaps account and update it as your site changes.
Licensed Sites: 16 (not including my own or ubbdev)
Those are kind of out of scope I think, it's designed to just pull the threads themselves, not forum topic indexes (which will likely be in a future build).
I'd have to see how you're calling these extra pages (header/footer) to be able to give a for sure answer.
I'm pretty sure that I could make an index page which could take all files in a directory and build a sitemap for you; but keep in mind that if it's just a custom header and footer, Google won't really care. You're not feeding your content to them, you're simply feeding the URLs of your threads to them; the feed does not include any post content (as they don't want it).
Another product I'll be coming out with soon is a simple sitemap lister where you can manually enter URLs on your site; but I'm working on a couple of other projects until I can get to that.
I've been toying with cURL for building automagic sitemaps. I have something that works for the most part, but you have to consider that PHP has a script execution time limit, which would make it die prematurely on large sites (or forums, for example).
Since I moved from 6.5 to 7.X, Google has all but stopped crawling my threads/topics, and as a result the number of new members I get has dropped dramatically! I hope this will help (should I decide to buy it)!
I have recently moved my forums to a new domain, so it was important to me to get my site back into Google under its new domain name as soon as I could.
Gizmo has been raving about the sitemap program he made, so I thought to myself (he seems like a nice chap), I think I'll give it a go and see if it really does work.
The answer is YES.
21 days after I installed the sitemap, Google had listed my site under its new domain name with 6,000+ links. Not all my forums are public.
It is fair to ask what would have happened if I didn't have the sitemap installed. Well, I can't answer that, to be honest, as I'm now listed. I like to think that it has helped in my situation of a domain move for my forums.
The sitemap has worked for me, but if you have any questions about its functionality then please contact Gizmo via PM, as I can only report back the results I have.
Gizmo's sitemap isn't free, but for my needs it's worth the money. You will also need a Google Webmasters account. It also works with Yahoo.
v0.3 was completed this morning. I've done a lot of code cleanup, from both a security and a code standpoint. Instead of "assuming" data passed to the script from the database is "valid" for feeds, we run subjects through htmlentities (thus validating code) and we force "int" on fields which should only contain numbers.
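The sanitization described above (escaping subjects, forcing integers on numeric fields) can be sketched like this; Python's `html.escape` stands in for PHP's `htmlentities`, and the field names here are hypothetical, not the script's actual column names:

```python
import html

def sanitize_row(row: dict) -> dict:
    """Escape text pulled from the database so it is feed-safe,
    and force numeric fields to int so nothing else can slip in."""
    return {
        # Escapes &, <, > and quotes, much like PHP's htmlentities.
        "subject": html.escape(str(row.get("subject", ""))),
        # Casting to int rejects anything that isn't a plain number.
        "thread_id": int(row.get("thread_id", 0)),
        "post_count": int(row.get("post_count", 0)),
    }
```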
I've repaired a problem with the offsets option introduced in 0.2, which allows you to split your one large sitemap into several easier-to-manage sitemaps (thus allowing you the opportunity to not go over the 50k link limit that some services enforce).
I have also built onto the offsets option the ability to utilize a sitemap index, which will list any "valid" sitemaps from your forum (built using this script only). Everything will be done automatically; all you need to do is submit the sitemap index to Google (or a search engine which supports them) and, when you tell them to look at your site, they will grab your list of maps and work through them accordingly.
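For reference, a sitemap index is itself a small XML file in the standard sitemaps.org format, listing the location of each child sitemap (the URLs below are placeholders, not actual script output):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap.php?type=1&amp;offset=0</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap.php?type=1&amp;offset=1</loc>
  </sitemap>
</sitemapindex>
```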
I have also set the "offsets" option to off by default; when I introduced it in 0.2 I set it to "on" by default.
Coming in 0.4: I'm looking to add the ability to assign "importance" to URLs based on their posted date; this will allow newer data to have a larger importance than older data, thus allowing it to be crawled more often than older threads.
I'm also looking to allow the maximum thread inclusion to be adjusted via the URL as well; however, this will not be able to go over the script's hardcoded 50k limit.
Also, if you've got a suggestion for expansion, please let me know, I'm always looking at new ideas.
If you're looking to upgrade, please email me (james[at]virtualnightclub[dot]net) with your licensed URLs and I'll send off the new files.
If you're looking to purchase a license, please feel more than free to email me for more information.
Custom pages are out of scope for this (at least at this point). I COULD code it in, but cost vs demand... There would have to be a way to add items into the map, and since the current manner just auto-pulls data from the database, there is no real "need" for the ability to add items (short of posting more content to your forums).
I do, however, have plans to write a separate sitemap utility for custom/standard pages. With this script, since I have to jump through hoops to cater to limitations (file size and total links), it would be difficult to factor in custom pages; the next product will gear just towards your "standard site" items.
You could always try adding them to the sitemap manually (by hacking it in) so long as you're sure you won't bust over the 50k links/page limitation.
Q. Can I 'select' which forums I want to include on the sitemap? A. No, it takes any "public" forum and creates a map for posts in it.
There is a valid reason for this: either way Google will eventually crawl all data; a sitemap just helps it along. Either way, the data will be there (sooner or later).
Q. What is your upgrade policy for your program versions and for UBB versions? A. Upgrades, at this point, are free of charge to valid license holders; I upgrade as the UBB upgrades (for example, as the page variable was redundant, it was also purged in 0.3). The upgrade pricing may change in the future, but at this point I have no plans of charging for upgrades. A. Upgrades are free for 1 year (365 days), after which you can purchase upgrades at $25/yr.
To receive an upgrade to the UBB.Sitemapper, all you need to do is email me after one is available and I'll email it back; however, I do request that when emailing me you also include the site that you've purchased it for, so I can do internal auditing (for a possible download manager in the future).
I've been putting some finishing touches on 0.5; the current changelog is:
-- 0.5 --
- Removed "Priority" tag as it was optional and was producing duplicate errors from Google.
- Removed some "test code" from the header of the script left in from 0.4.
- Fixed a bugglet with the "read from config" option which was added in 0.4.
- Added a "tick" to config which allows you to display the optional "changefreq" tag.
- Added the ability to pull the community name from your UBB config as the feed name.
- Updated the changelog format to fix spelling and grammatical errors introduced in 0.3.
If you've noticed an inconvenience or would like to see a change (or update), please let me know so I can get it in! Feel free to email me as well (james[at]virtualnightclub[dot]net).
Use the sitemap index option; it'll create an index of X amount of sitemaps, each map being X size based on the parameter in the sitemap script...
EX: You have 100,000 threads and you set 25k links per sitemap, so the script would push out an index with 4 sitemaps, each sitemap pushing an offset of 25k; the first would cover 1-25,000, the second 25,001-50,000, etc...
I'd recommend using no more than, say, 25k chunks, however; it makes the query a little easier to deal with, lol...
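The chunking arithmetic in the example above can be sketched as follows (a hypothetical helper for illustration, in Python rather than the addon's PHP):

```python
def sitemap_chunks(total_threads: int, chunk_size: int = 25_000) -> list:
    """Return (start, end) thread ranges, one per child sitemap.
    With 100,000 threads at 25k links per map, this yields 4 sitemaps."""
    chunks = []
    start = 1
    while start <= total_threads:
        end = min(start + chunk_size - 1, total_threads)
        chunks.append((start, end))
        start = end + 1
    return chunks
```

So `sitemap_chunks(100_000)` produces the four ranges 1-25,000, 25,001-50,000, 50,001-75,000, and 75,001-100,000, matching the example above.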
Q1 - I've just moved my site to a new host, and it's now under a new domain with the old domains redirecting to it (for now at least). However, the new domain has 2 variants (a DOT com and a DOT co DOT uk); the latter just points back to the former. Do I need 2 licenses or 1?
1. Just install it for the primary; you really should have 1 domain with the others redirecting to it for better results in engines (duplicate anything brings one down in points).
2. Yes, the script itself is completely separate from the UBB; as it only relies on standard data in the db, it should be compatible with just about any install within v7.
Sweet. Point taken on the redirects - I've just moved a load of stuff over from another hosting company and registrar, so things are a little disorganized at the moment. Ultimately everything will be redirected to a single point.
Removed "Priority" tag as it was optional and providing duplicate errors from Google.
Removed some "test code" from the header of the script left in from 0.4.
Fixed a bugglet with the "read from config" option which was added in 0.4.
Added a "tick" to config which allows you to display the optional "changefreq" tag.
Added the ability to pull the community name from your ubb config as the feed name.
Updated the changelog format to fix spelling and grammatical errors introduced in 0.3.
If you need an update (and qualify for updates, i.e. within 1 year of purchasing your license), please feel free to email me at james[at]virtualnightclub[dot]net. If it's been more than a year since you purchased the script, you can renew here.
Starting this week, I will be working on v0.6 of the UBB.Sitemaps script. There will be several updates to the script that I've wanted to push out for some time, and these features will relate to the speed at which the script works.
I'll be adding options for:
1. GZip - Compressing the output of the file so that it is loaded faster by clients (search engines in this case).
2. Caching - Using an in-house script and not the UBB's code (as this script works as a standalone for compatibility reasons, we don't rely on UBB code within the script). By default this will be set to 24 hours.
3. Expires Header - This will be another method that allows clients to retrieve content only when you specify it is going to be new. By default this will be set to 24 hours.
Some of these items will not function with all servers; your provider will need to allow you to:
1. Write flat files via the webserver; some of these files may end up belonging to your webserver user, but will overwrite themselves when needed.
2. Use GZip to compress files.
You'll likely need to play with settings to ensure compatibility but if you can use gzip within the UBB you should be able to use it within the sitemap script.
I'll also be looking for some general code cleanup opportunities within the script as well. I'd highly recommend the upgrade to all users for the new caching options, as it'll increase the speed of the script exponentially (especially if you're using the "one big file" method vs the "sitemap index" method (default)).
I'm unsure as to a specific date that the new version will be available however I'll announce here when it's available.
Please also note, if it's been longer than a year since you purchased a license of the script, you'll need to renew your license; you can do so via the information page.
On track for testing of all systems for release tomorrow; going to go ahead and do a couple of last-minute updates to the script tonight and package it up.
For those of you who've registered for the script within the last year (365 days), email me directly for an update (james[at]virtualnightclub[dot]net); for those of you who registered before that, please renew here and I'll ship off v0.6 with your payment verification.
Just packaged 0.6 up and sent off the first copies to those in good standing that have requested it.
I have the caching section working on my server if you'd like to try the preview URL (here). Note that I've been having some general speed issues with my server as of late, so don't mistake any slowness as an issue with the script (it should actually parse instantly when serving a cached page).
Say you have it set to display in Google and Yahoo (or other sites): they send one request, the cache is built, and the output is displayed. They send another request and, instead of rebuilding all data from the database (which can be intensive), it just reads out the cached file (until the cache expires).
Basically, it makes it so unnecessary hits are dealt with in a manner that doesn't bog down system resources.
You won't be penalized any more than you would be if they crawled the UBB and found the URLs on their own; the script feeds them a URL and their crawler will hit pages that are linked off of that page itself to build its real index.
If you follow the example in the readme on what to give Google, it should actually build the URL the same as what you posted (as a form link); the sitemap can be customized to build just about any format of URL in just about any type of sitemap you'd want. The recommendation, however, is to build the sitemap as: ubb.sitemap.php?type=1&se=2&offset=0
Currently we don't have an option to build with SEO URLs or the fake .html extension; we just have SE-friendly URLs and standard URLs (which will likely change in the future).
The script primarily aids SEs by feeding content URLs to the SE; the SE will continue to crawl URLs as it finds them (specifically on links you feed to them), and it'll build "new page" links and other links off of the pages your sitemap feeds into their systems.
What I'd like to see is the UBB gracefully forwarding URLs with a redirect (301/302 header) based on which URL schema the user chooses in the UBB; we could eliminate duplicate URLs altogether (as any URL accessed that wasn't in the selected default schema would send a redirect, like what the redirector scripts do). But I'm unsure as to how easy/difficult that would be.
The needed $config array lines from the config.inc.php file are formatted into the $conf array; really, it won't affect things at all. If the user is using the UBB.threads config file (which is the default), the script sorts everything as it's needed.
UBB.Sitemaps 8/28/2018 Developer Notes (Release Announcement)
- No longer in Alpha, as the script has been tested extensively on many server configurations.
- Added a listing of all forum names (type 7), which is linked from the Sitemap Index page.
- Added an option to not display the frequency line.
- Added an option to not display the lastmod line.
- Added an option to not display the .html extension for sitemap files.
- Added support to always produce HTTPS (SSL) URLs (useful for sites on a CDN where the local server may still try to produce non-HTTPS links when using wildcard HTTPS URLs).
- Added support for including the UBB.threads configuration for sites running UBB.threads from the webroot (not in a folder).
- Added support for specifying a domain (useful if your UBB.threads install is on a subdomain and you wish to have UBB.SiteMaps located on your main domain).
- Added support for human-readable permalinks with Apache rewrite rules.
- Removed the need for a trailing slash in the folder config.
- Changed the MySQL database connection to utilize MySQLi for PHP7 support.
- Minor code changes for script performance.