Add and manage XML sitemap for SEO

Bing, Yahoo! and Google all recommend submitting a sitemap to their crawl services so that a website's URLs are brought to their crawlers' attention. Otherwise the bots rely largely on the quantity of cross-linking from other sites, and for many of our members this is very hit or miss, resulting in many member pages never making it into SERPs.

We would greatly appreciate some mechanism for adding and managing XML sitemaps for all, or better yet, specific pages of a WA member site.

28 votes

fruitbat shared this idea
Evgeny Zaritovskiy responded:

Thanks to Barbara, here is the current workaround:
Create a new folder under file management called sitemap.
Then upload your sitemap to this folder.
Submit your sitemap to Google; it will look something like this: yoursite.com/resources/sitemap/sitemap.xml
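For anyone following the workaround above, the file you upload is a plain XML file in the sitemaps.org format. As a minimal sketch (the URLs below are placeholders, not real WA pages), a short Python script can generate one:

```python
# Minimal sketch: build a sitemap.xml for a handful of pages.
# The URLs below are placeholders -- substitute your own site's pages.
from xml.sax.saxutils import escape

pages = [
    "https://yoursite.com/",
    "https://yoursite.com/about-us",
    "https://yoursite.com/events",
]

entries = "\n".join(
    f"  <url><loc>{escape(url)}</loc></url>" for url in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

Once uploaded under Resources, the resulting URL is what you submit in Google Webmaster Tools.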

A good place for free Google sitemaps is:


  • Cedric Volk commented  ·

    I can see the argument: most Wild Apricot sites are basic sites (like mine) and are generally used in the context of a small club environment where users may already know each other personally. So in that way, the website is just a tool to disseminate information to people who already belong to the group - right?

    In contrast, however, our site (myvillageoc.com) is not only a hub for people already in the know - it is a tool that we use to capture new members and attract new business. I would say that any organization - from non-profits to lots-of-profits-please - has a constant need to infuse fresh blood and fresh cash into its efforts. The Red Cross, for example, can't exist without new donors daily.

    The XML sitemap function contributes to the effort of being found. I imagine each of our Wild Apricot websites as a skyscraper office building with thousands of suites, each holding a gem of service or information. I picture an outsider visiting the lobby of this building and, to their dismay, finding neither an office directory nor a map to guide them to the service or information they need.

    In this vision our sites are the buildings, and the visitors are the search engine crawlers. I ask the development team at Wild Apricot whether they've ever found themselves in an everyday situation like this: standing on a train platform or at a bus stop trying to get somewhere, but nary a schedule or route plan to be found. In that moment, wouldn't you go straight to your mobile device and Google the solution to your problem inside of two minutes?

    If the train or bus service made their sites invisible to search engines, you might not get an answer on the time for the next voyage - or perhaps the more important message that the station you're at has been removed from all route plans; the train will never come.

  • margaret commented  ·

    I'm staggered WA are suggesting this as a solution.
    As has been stated earlier, this is a core SEO function.
    Asking people to manually update the sitemap (especially given the number of non-technical people who manage these sites) is ludicrous if your site is *not* static (e.g. a membership site), or if it's active with posting.
    The decision to not address this shortcoming in the website software hurts every WA website.
    A pity, given that WA has been proactive with HTTPS.

  • barbara commented  ·

    As suggested by ooyoi.com, the following workaround WORKS!!! Yay! Maybe WA can let their tech support know about this too.

    Create a new folder under file management called sitemap.
    Then upload your sitemap to this folder.
    Submit your sitemap to Google; it will look something like this: http://yoursite.com/resources/sitemap/sitemap.xml

    A good place for free Google sitemaps is:

  • Anonymous commented  ·

    This is a very important feature for improving the SEO on my site. All SEO analyses of my site tell me that I should have this feature.

  • Anonymous commented  ·

    This is basic for websites now... ALL websites do this as a matter of course, and Wild Apricot should too.

  • Anonymous commented  ·

    ** Critically important for all us SEO gurus. **

    - Only need to create the sitemap.xml file for pages OUTSIDE the website, not the internal pages.

    - This will reduce the load on the WA servers.

    Can it be added to the next release, please?

  • Schneibs commented  ·

    This is critically important for the organization I represent as well. For our purposes a single XML sitemap for the site would meet our requirements.

    I am watching a scary traffic slide since switching to WA and losing the XML sitemap. I do wish I had realized this in advance.

  • monte russell commented  ·

    One thing you could do while waiting for a real sitemap is to use Google's Web Master Tools to "fetch" the pages you want indexed.

    It is under the website: in the left column, go to "Crawl", select "Fetch as Google", and in the URL box paste the complete URL for the page you want to index.

    It is a little tedious if you have a lot of pages you need indexed...

    Hope this helps.

  • Dmitry Buterin commented  ·

    Unfortunately there is no 'root directory' as such - all sites are generated on the fly by one big engine. So unfortunately, right now there is no way to put up this file.

  • malarkist commented  ·

    That's my understanding as well, Dmitry. For this reason we need to be able to put a sitemap file in the root directory. But Wild Apricot limits us to only uploading files under the Resources folder.

    Perhaps a simple workaround until the capability is built in - Could we send a sitemap.xml file to you so you could upload it to our root directory? Any reason this would not work?

  • Dmitry Buterin commented  ·

    Actually, I do not think it will work this way :-(

    My understanding is that Google will only index pages which are under the same URL path as the sitemap (for security reasons). So if the sitemap is in Resources, only links under Resources would be indexed.
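Dmitry's caveat matches the sitemaps.org protocol: a sitemap may only list URLs at or below its own directory. A small sketch of that rule (the URLs here are illustrative):

```python
# Sketch of the sitemaps.org location rule: a sitemap can only cover
# URLs whose path starts with the sitemap's own directory.
from urllib.parse import urlparse

def allowed_in_sitemap(sitemap_url: str, page_url: str) -> bool:
    """Return True if page_url may appear in the sitemap at sitemap_url."""
    s, p = urlparse(sitemap_url), urlparse(page_url)
    if s.netloc != p.netloc:
        return False
    base = s.path.rsplit("/", 1)[0] + "/"  # directory holding the sitemap
    return p.path.startswith(base)

# A sitemap buried under /resources/sitemap/ cannot cover top-level pages:
print(allowed_in_sitemap("http://yoursite.com/resources/sitemap/sitemap.xml",
                         "http://yoursite.com/about-us"))   # False
# A root-level sitemap can:
print(allowed_in_sitemap("http://yoursite.com/sitemap.xml",
                         "http://yoursite.com/about-us"))   # True
```

This is why the Resources-folder workaround triggers the "not allowed for a Sitemap at this location" error mentioned below.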

  • malarkist commented  ·

    Did you verify in your Webmaster Tools that the URLs are being recognized?

    I submitted my sitemap to Google and get an error on every URL of my site: 'This URL is not allowed for a Sitemap at this location'.

  • Gary C commented  ·

    Nightly generated sitemap.xml file. I've read many people looking for sitemap.xml functionality, and I would like to reiterate how important this is. I know it is possible to manually run a tool that scrapes the web pages, but that's manual and doesn't have the data that you have in the database.

    Instead, I would love to see a nightly task that builds the sitemap.xml by evaluating the change date of static content coupled with the dynamic content from gadgets. For instance, if a page has a gadget that displays a new discussion post, that page should be dated at least to when that post was made.

    The sitemap.xml file should (appear to) be in the root so search engines find it automatically. You could have an ISAPI filter or HTTP module that redirects domain.com/sitemap.xml based on the domain. There should also be a robots.txt file.

    Dynamic content is currently linked with query string parameters. All pages should contain the header text in the URL. For instance, instead of this..


    It should, perhaps, be something like this, which is much more search-engine friendly. Note that the relevant IDs are at the end so the WA program can extract and use them.
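Gary C's nightly-build idea above could be sketched roughly as follows. The page records, field names, and dates here are hypothetical (Wild Apricot's internal schema is not public); the point is taking the later of a page's own change date and its newest gadget content as the lastmod value:

```python
# Sketch of a nightly sitemap build: a page's <lastmod> is the later of
# its own change date and its newest gadget content (e.g. a forum post).
# The page records and field names are made up for illustration.
from datetime import date

pages = [
    {"url": "https://example.org/home",
     "changed": date(2015, 3, 1),
     "gadget_updates": [date(2015, 4, 2)]},   # e.g. a new discussion post
    {"url": "https://example.org/contact",
     "changed": date(2015, 1, 15),
     "gadget_updates": []},                   # purely static page
]

def lastmod(page):
    """A page is as fresh as its newest content, static or dynamic."""
    return max([page["changed"], *page["gadget_updates"]])

urls = "\n".join(
    f"  <url><loc>{p['url']}</loc>"
    f"<lastmod>{lastmod(p).isoformat()}</lastmod></url>"
    for p in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{urls}\n</urlset>\n"
)
print(sitemap)
```

Run nightly, a job like this would keep the sitemap's dates in step with dynamic content without any manual scraping.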

