Ability to export backup of the whole account
Wild Apricot automatically backs up all the data in all accounts daily. However, this is intended for emergency server problems only, and these backups are not accessible to clients. Restoring a specific page or set of records on client request is a time-consuming operation (which we currently do not charge for).
There are also export operations for all the key data sets (members, donors, event attendees, payments), but these do not cover settings and web pages and have to be run manually, one by one.
Ability for clients to initiate a full backup of their account at any time.
1) The key issue is to generate a backup which can be browsed in some way, for example to manually retrieve some records for an emergency restore, or to transfer the data into another system.
2) Ideally, this backup can be imported back into the account. However, this increases the volume of work required by a huge margin.
Not yet. But in our next release (5.0) we will include new tools that will allow you to save historical copies of any pages.
Is this what you're looking for?
If not, please elaborate on the compliance requirements.
Has there been any progress on this?
Backups are not just a matter of restoring after a problem; they are for compliance reasons as well.
FYI - Website Backup tool.
It won't create a backup that you can easily restore from but if you just want a copy of all your publicly visible pages/images you could use a tool like HTTrack Website Copier http://www.httrack.com
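For anyone trying this, a minimal sketch of an HTTrack invocation; the domain and output path here are placeholders, not anything Wild Apricot-specific:

```shell
#!/bin/sh
# Hypothetical example: mirror a public site with HTTrack.
# SITE_URL and OUTDIR are placeholders -- substitute your own values.
SITE_URL="https://example.org"
OUTDIR="./site-backup-$(date +%F)"

# -O sets the output directory; the "+domain/*" filter keeps the crawl
# on your own site. The echo makes this a dry run -- remove it to crawl
# for real (requires httrack to be installed).
CMD="httrack $SITE_URL -O $OUTDIR +${SITE_URL#https://}/*"
echo "$CMD"
```

The dated output directory means each run lands in its own folder, so older mirrors are kept as historical snapshots.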
I am thinking that it's better to build purpose-specific functions. Very few of our users would be able to use zipped HTML of all pages to do things like you have described. Your suggestions make good sense and I would appreciate it if you could post them as new threads.
Tom Gibson commented
It is not just for backups. If I could download a zip with just the page data html I can use it for audits. For example: find broken links. Find orphan pages. List all external links. These are easy scripts to write (I use bash on linux) and run on the desktop, not the server. Downloading one page's html at a time defeats this whole strategy.
Today's task is to find links that still point back to our old web site. It will take a week instead of an hour without this download.
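As a concrete illustration of the kind of audit Tom describes, here is a shell sketch that scans an already-downloaded mirror for links that still point at an old domain; the directory and domain names are assumptions, not real values from this thread:

```shell
#!/bin/sh
# Scan a local mirror (e.g. one produced by HTTrack) for stale links.
# MIRROR_DIR and OLD_DOMAIN are assumed placeholder values.
MIRROR_DIR="${MIRROR_DIR:-./site-backup}"
OLD_DOMAIN="${OLD_DOMAIN:-old-site.example.org}"

# File name, line number, and matching href for every stale link.
grep -rEno "href=\"[^\"]*$OLD_DOMAIN[^\"]*\"" \
    --include='*.html' "$MIRROR_DIR" || true

# All distinct external link targets, for a quick manual review.
grep -rEho 'href="https?://[^"]*"' --include='*.html' "$MIRROR_DIR" \
    | sort -u || true
```

The same pattern extends to the other audits mentioned (orphan pages, broken links) by swapping in different greps, which is exactly why a bulk HTML download matters: none of this works one page at a time.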
Felix Figuereo commented
It would be nice to have the ability to export all custom/global JS and site content with the push/click of one button, for the purpose of safeguarding custom code, design, and content.
I think you are doing quite enough and I am confident that backup systems we have in place are sufficient for disaster recovery situations.
Richard Pouzar commented
In reading these comments and thinking through various scenarios, it seems that it would be difficult for you to offer backup guidelines that would cover all situations and all organization types. So, here's our situation and I'll ask you to comment on what backup holes we need to plug.
We've got about 200 webpages using membership renewals, applications, event registration as well as discussion forums. We keep backups of the member and contact database, all uploaded documents and graphics. We currently do not export our invoices to a backup. We have NOT taken the tedious step of backing up the source code of the 200 webpages, but changes go through a single individual which might mitigate some risk of loss. We're assuming that Wild Apricot can recover from any hardware/software malfunction without any loss to our systems. We're hoping that in the unlikely event that some hacker recreationally destroyed our website that Wild Apricot could reconstruct it (?).
Our website manager uses a Mac so WebStripper (or PageNest) is not an option. Given the Chief Apricot's previous comments, I'm not sure WebStripper is viable as a source from which to upload the website anyway.
What additionally should we be doing?
Thanks Gary. Your suggestions align well with our own thinking.
We do not expect to implement restoring from backup - at least for a while - because the software evolves all the time, and maintaining backwards compatibility would be a challenge.
Gary C commented
I'd love to see it implemented something like this, with individual modules.
Files.zip: The entire resources folder is captured in a ZIP file with the directory tree.
Configuration.xml: All configuration data is stored in a well documented, schematized XML document.
PageData.xml: The configuration and content for all pages is stored in a well documented, schematized XML document. Each page would be in a Page element with attributes for the type, meta tags, name, path, etc. A child element would exist for each of the content blocks.
UserData.xml: All user information is available in a well documented, schematized XML document.
Others for forum data, events, finances, etc.
All of the module files would be captured into a Backup-YYYY-MM-DD.zip file. An entire backup file could be selected and restored. However, a single module could be selected and restored. e.g. just page data. This would also provide a nice way to bulk import data. I imagine this would make it easier to move from other CMS systems to WA. Tools could be written to transform backups from other systems to WA's schema.
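To make the proposed layout concrete, here is a shell sketch that stages the hypothetical module files and names the archive per the Backup-YYYY-MM-DD.zip convention above; the zip step is commented out since it assumes Info-ZIP's `zip` is installed:

```shell
#!/bin/sh
# Stage the hypothetical module files from the proposal above.
STAGE=$(mktemp -d)
for f in Files.zip Configuration.xml PageData.xml UserData.xml; do
    : > "$STAGE/$f"   # empty placeholders standing in for real exports
done

# Archive name per the Backup-YYYY-MM-DD.zip convention.
BACKUP="Backup-$(date +%Y-%m-%d).zip"

# Uncomment to actually build the archive (needs Info-ZIP's zip):
# (cd "$STAGE" && zip -q "$OLDPWD/$BACKUP" *)

ls "$STAGE"
```

Because each module is its own file inside the archive, restoring (or importing) just one of them is a matter of extracting a single member rather than unpacking the whole backup.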
Gary C commented
I agree with most of the people on this thread. An ability to back up and export the data is critical and should be at the top of the list. You mentioned that part of the complexity is due to incompatibility between versions of the site. Even if the standard operating procedure was to have all customers back up again right after a new release, that would be reasonable.
Sorry, this is not possible. Our system is not a typical 'dumb' hosting where you just upload pages created with any software - it's software as a service with a built-in CMS, so pages can only be created/edited with the integrated web editor.
So, you're saying there is no FTP option? What if I wanted to design web pages with Dreamweaver or other software - it's not possible to upload them to the site afterwards?
Sorry, this is not possible with the Wild Apricot platform.
We wish to be able to back up web pages or the whole website. Is it possible to use Dreamweaver and connect via FTP to download the whole website, make changes, and upload pages? This would be a form of "back-up" for the site. If this is possible, please let me know, as the HTML editor in the W.A. system only allows for a few options, and I wish to use Dreamweaver for editing web pages and keep a backup of the whole website on a local PC.
Thanks for sharing!
It doesn't always work exactly right, but I've used WebStripper to pull down entire websites; database pages get converted to HTML. http://www.webstripper.net Also, Adobe Acrobat has a feature to convert and save all the pages in a website to PDF.
Cathie Izor commented
Thanks, Dimitry, for the tip on manually saving pages. That gives us an idea on what to do for critical pages.
Our organization is a brand-new customer, though, and the lack of user backup was our biggest concern and made us think very hard about whether we should sign up or not. We LOVE just about everything about the system except for two things: 1) not having e-mail addys with our domain name without going through the hassle of Google setup, and 2) much more important, not having user backup. I can work around the e-mail situation but the lack of backup control makes me nervous.
Please move user backup as high as possible on the wish list and put effort into it. It is very important to users.
P.S. If we hadn't been able to easily export the member and contact databases, we would not have signed up for the system. I am using that as our backup for that information because that is our lifeblood.
Right now, the only way I can think of is to save pages manually via your browser ("Save page as...") - one page at a time.
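If saving pages one by one in the browser gets tedious, the same thing can be scripted with wget. This is a sketch assuming the pages are publicly visible; the URLs listed are placeholders:

```shell
#!/bin/sh
# Scripted "Save page as..." for a handful of pages. PAGE_URLS is a
# hypothetical list -- replace with the pages you care about.
PAGE_URLS="https://example.org/ https://example.org/about"
ARCHIVE_DIR="archive-$(date +%F)"   # dated folder, for versioned copies
mkdir -p "$ARCHIVE_DIR"

for url in $PAGE_URLS; do
    # -p fetches the images/CSS the page needs, -k rewrites links for
    # offline viewing, -E appends .html. The echo makes this a dry run;
    # remove it to actually download.
    echo wget -E -p -k -P "$ARCHIVE_DIR" "$url"
done
```

The dated folder name gives the kind of per-version archive an archivist would want: run it before and after each edit and you keep every revision locally.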
We're a professional association of archivists, so for me, backup is not just for emergency webpage recovery, but long term preservation of web page content and visual appearance - most or all versions of each page. With older systems, I just saved the HTML files, graphics, etc. and added dates to the file names.
Are there any workarounds that would enable me to preserve this data (locally - not just on WA), without eating up more time than I have to devote to web page maintenance? WA was supposed to make my tasks easier, not more complicated. (I didn't realize it would take away my preservation capabilities). Please advise. Thanks!