Restricted access to documents and images (member only files)
If users have a direct link to a file, they will have the ability to download it (i.e. page permissions are not inherited by the files on those pages)
Restrict folders and the files in those folders by membership/administrative/group level
We are excited to announce the launch of the “Restricted access for folders” feature. It is now available for use, and we are closing the wishlist thread. Thank you for your patience and feedback, which helped us make this feature as useful as possible.
Here’s how it works:
- You now have the ability to set permissions on folders. All files and subfolders inside are automatically assigned the new access level.
- When a new file is uploaded into this folder or sub-folder, it is automatically assigned the same level of access.
- Within the restricted folder, the administrator is still able to change the access level for individual files and subfolders.
- The new functionality is also available in the Insert resource dialog.
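The inheritance behavior described above can be sketched roughly like this. This is a toy model in Python, not Wild Apricot's actual implementation, and the access-level names are made up:

```python
# Toy model of folder-level access inheritance (hypothetical, for illustration):
# a file or subfolder inherits its parent folder's access level unless an
# administrator sets an explicit override on it.

class Node:
    def __init__(self, name, parent=None, access=None):
        self.name = name
        self.parent = parent
        self._access = access  # None means "inherit from parent"

    @property
    def access(self):
        if self._access is not None:
            return self._access          # explicit admin override
        if self.parent is not None:
            return self.parent.access    # walk up to the folder's level
        return "public"                  # hypothetical default for the root

root = Node("Resources")
members = Node("MemberDocs", parent=root, access="members-only")
report = Node("report.pdf", parent=members)                  # inherits
flyer = Node("flyer.pdf", parent=members, access="public")   # admin override

print(report.access)  # members-only
print(flyer.access)   # public
```

Newly uploaded files behave like `report.pdf` here: with no override of their own, they automatically pick up the folder's level.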
Some comments - mainly along the same lines as others:
1. We would also need the ability to set permissions on folders, and have all the files and folders in the folder inherit the parent permissions (unless different permissions were specifically set in the child files or folders).
2. It would also be useful to have a default, so that everything created/uploaded defaulted to having specific access.
3. After opening a file without being logged in, we get the login prompt (which is correct), however after logging in it does not show the file, but goes to the home page.
Got all excited that this feature was implemented, until I saw that it was only available for files and not folders, which makes it unusable for our purposes.
+1 for folder-level access (which I see is in the works eventually).
Hi, supporters. Thank you all for your feedback.
We plan to add the ability to set access for folders as a next step. Right now we are working on this feature: https://forums.wildapricot.com/forums/308932-wishlist/suggestions/8825731-uploading-documents-and-photos-in-forum-and-blog-p and as soon as that is done, we will move on to the next step of the “Restricted file access” feature.
Answering your question,
Paulp575123, for each page on your Wild Apricot site, you can control whether it appears in your site menu and which types of users can access it. You can read more on this help page: https://help.wildapricot.com/display/DOC/Page+access+and+visibility?from=YIblB
Skip Reddy commented
My feedback on the current state: allow application of filters at the directory/folder level, with each sub-object (subdirectory or file) inheriting the permissions of the parent by default.
I also tested bookmarking the document URL. Restriction worked perfectly. Was prompted to log in each time. Again, the home page displayed instead of loading the document.
Not logging out must have been the problem. When testing last night, I was logged into WA (admin) and three other personal accounts. At that time, the document could be accessed from all email accounts.
This morning I separately logged into the other accounts and clicked the document URL I had sent for testing: WA prompted all accounts with the log-in page, but after logging in, the home page displayed instead of the document. My guess is that because the URL points to a file instead of a page, the system doesn't have a referring page to return to.
Dmitry aka Lanky Apricot commented
I could not reproduce the bug you described.
Maybe you forgot to log out from the website before opening the link?
Please describe the steps in detail.
My testing was not as robust as I would have liked because we have not imported our membership yet. However, I discovered that if I pasted the URL of a restricted document (in this case a PDF) into an email and sent it to someone without a profile in the organization, or to someone in the general public, the document could be opened. I expected an alert box stating that the user must log in to view it. This needs to be fixed.
I echo other comments about restricting access at folder levels. But the extra column in Files view mode showing Access Level is helpful. It's also nice to be able to select/deselect multiple files at a time.
Charles Pena commented
Tested this file-restricting feature; it appeared to work as designed. I would have preferred to restrict a folder, but since you can select multiple files/pics to restrict within a folder, it appears to accomplish the same objective. As with most aspects of WA, I would prefer more administration levels to choose from, but since you allow groups, this will let us create more specific access groups. Not sure if it is part of this improvement, but the ability to restrict download rights would be even more useful: we might want all members, and maybe even the public, to see a picture but not be able to download it. As far as restricting access goes, though, it appears to handle the issue.
Mike Felton commented
The file access control process and interface are working. (My only small complaint is that the popup box displays too low; on laptop screens the Save button is out of sight.)
As others have mentioned it would be great to have the FOLDER access function added in due course.
Received an e-mail that I am a tester for this functionality.
Everything I see on the Help page says it's for individual files.
Looking through the comments here, I see it has also been requested for folders on the Files page.
Also, what about doing this for specific pages - or is this already available and I'm missing it?
Just restricted access to our site files . . . . GREAT FEATURE. It's easy to use, especially being able to select multiple files using the various browser options such as 'space bar the 1st & last file'. CAVEAT: Wish you could allow restricting a FOLDER which would automatically restrict contained files. This would require a restrictive option such as 'All Files in Folder Will Have The Same File Access Restrictions'. That would make it easier to restrict files in a folder if they are all the same restriction. I have a folder with sub folders & files and all have the same access restrictions. Took lots of time to open each sub folder and select the files and restrict access. Would have been much easier to restrict each subfolder, or even easier to restrict the top level folder.
Kim Skimmons commented
I agree with: "We need a way to restrict access to files, folders, and documents so that they can only be accessed and downloaded by people who are logged-in current members."
The only real solution I see for documents that *must* be protected is to upload them to a different cloud service that has actual folder- and file-level protection options (e.g., Google Drive, box.net), then share the password-required link on WA pages/emails (just don't publish the password on the WA page or in the email too!).
Until WA provides a solution, don't put any documents, images or photos on WA servers that you absolutely don't want non-members to be able to access.
Bill Coulam commented
Most hosting providers give you a Linux-like interface to the folders and files used by a website, in which I'm able to set read/write permissions on both. WA doesn't seem to offer that for folders and files in the Website | Files applet. Could you please add that feature? We'd like those with admin privileges to be able to get to certain folders through this applet, but not let web bots, crawlers, and enterprising individuals find them and download them.
I am looking forward to test this functionality as it is very important for us.
OK, I was told that this was for file sharing, but from what I read it's all about the ability to download files that are put up by the admin. What I need is a folder that I can upload PDF files to so that others can download them. This will be for reports that will be used in a meeting.
Randall Rensch commented
I'm told that WA programmers are working on this issue and it is indeed considered an important one. I am looking forward to a simple, secure solution, as our own users' confidence is as important to our group relationship as WA's users' confidence is to its business relationship.
Meanwhile, I have learned a bit more and it might at least help others repair the damage. From here on, I hope that such guidance will come from Wild Apricot, not requiring our own prospecting.
It's likely that Google found its way to our URLs via a page that had been retired but was, for some reason, no longer secure. (The club members involved are no longer involved in such things.) With guidance from Support, we made the page secure and renamed it so that it will no longer be found. However, simply renaming a page isn't a solution, as the WA system automatically sets up a redirect.
Support says it’s because the system always remembers any page’s original unique base URL (e.g., “page-1234567”). Here’s the workaround whenever a redirect is NOT wanted:
1. After renaming a page, make a copy of it (Duplicate Page).
2. Delete the original page.
3. Delete the words “Copy of” (or whatever prefix was added) from the new page's Title and URL, so that both are the same as the original.
If the page is restricted to Members or Admins, you're probably okay now. But for good measure, now that you've also hidden the page and are sure there are no links to it (be sure it's not on any menu!), advise users not to reach the page by typing its URL into a search engine. URLs belong in a browser's address bar only! (This is yet another reason why security based solely on an obscure URL is no security at all.)
So now, to undo the damage: how to remove pages from Google's search engine result pages and cache display. (Again, this is only Google; an unsecured document may have been catalogued by any number of search engines and Internet archive sites.)
Pardon me if I save a few minutes by pasting from our own internal advisory, which will echo some advice already given in this thread:
Go to: https://www.google.com/webmasters/tools/url-removal
You'll need to be authenticated as a webmaster for the domain. There are various ways to do this, the easiest being to log in at your domain name registrar via Google's link. Some of the other methods (e.g. uploading a .txt file to root) are apparently not possible at WA. (Google will immediately show that you have been authenticated, but allow a short time for your authentication to be displayed among Google's selection list.)
Click "Temporarily hide" and enter the full URL. For example: http://www.yourdomainname.com/Resources/MemberContent/SecureFolder/Securesubfolder/sensitivedoc.pdf
Click Continue, then Submit Request.
Google will remove the doc from its own Search Engine Results Pages (SERPs), but might still retain its content in its database. Never put anything online that you absolutely would not want to become public (e.g., your bank password or the recipe for Coca-Cola).
To determine the page's exact, full URL if it is not known: copy it from the browser's address bar. Sometimes a PDF file will instead open in Acrobat or ask to be saved to your drive; in that case, the URL isn't displayed. Some ways to find it (one or another should work):
Find the page in Google's results. For example, search for site:yourdomainname.com pdf
Copy the displayed URL. If the URL is too long to be fully displayed, it will contain "/.../" and you cannot use this method.
So instead, click the displayed link (right-click to open it in a new tab or window). If the document/page opens in a browser (or, if it is not found but the URL is displayed), copy the URL.
If the document wants to open in something other than your browser, click the down-caret arrow in the listing and choose "Cache". Copy the source URL from the top of the cache-display page.
This removal request is good for only 90 days. Within that time, you'll need to rename the document(s) so that any incoming links are broken, or remove the document(s) from the server entirely. Note that the server must return a "Page Not Found" Error 404 to Google, or Google might again assume it is technically online and might display the cached version.
There are ways to prevent a page from being spidered, but many of them are purely voluntary, and some spiders and bad guys will ignore them. Also, some of those methods are not available to us at Wild Apricot, or cannot be implemented with a non-HTML file (e.g., a PDF).
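For illustration, the best-known of those voluntary methods is a robots.txt file at the site root (the folder path below is hypothetical, and Wild Apricot may not allow you to place one there). Well-behaved crawlers honor it, but it is a request, not access control, and it covers non-HTML files like PDFs too:

```
User-agent: *
Disallow: /Resources/MemberContent/
```

Note that listing a folder in robots.txt also advertises its existence to anyone who reads the file, so it should never be the only protection.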
This is only for Google. Other search engines may have their own removal procedures, or may not.
I'm almost embarrassed that I didn't find Google's Removal page much sooner. It seems pretty easy now that I know. But then, Einstein's non-mathematical thought experiments are pretty simple, too, once you know about them. Anyway, once discovered, the process is not so hard.
But prevention is way easier.
This requirement, like user-accessible backup/restore, is table stakes for any business offering web services for money.
So, a requirements spec is quite simple: look at any other site (e.g., Dropbox) and implement what they have. Quite simply put, give the public, members, and administrators appropriate access (no access, read, or read/write/create) to each file.
With this number of votes, why does this requirement continue to fall to the bottom of the stack?
I echo all of Randall's comments and concerns. Over the past 20 years I've worked on various publishing systems; they all have a built-in security system or the functionality to create one. Apache servers, for example, use an .htaccess file for access control. I am not a server admin and know nothing about WA's infrastructure, but if WA is using Apache servers, it seems this could be a possibility.
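For readers running their own Apache server (Wild Apricot does not expose this, so it only applies to self-hosted files), the .htaccess approach mentioned above is a few lines. The realm name and the path to the password file are placeholders:

```apache
# Require a login for everything in this folder and its subfolders.
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/site/.htpasswd
Require valid-user
```

Because the rule applies to the whole directory tree, PDFs and images get the same protection as HTML pages, which is exactly the folder-level inheritance people are asking WA for.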
Hacks, such as using Google Docs, Dropbox, etc, are not acceptable. These fragment our editing/publishing process and make it more difficult.
Our organization has sensitive information we want to keep hidden from the public. Currently, our content is in WA and sensitive files are hosted on an Apache file server to preserve security. The additional server costs extra money. Not ideal. I hope WA comes up with a solution soon.
Randall Rensch commented
I know nothing (well, not enough to be useful) about servers; I've always specified Apache for the shared-server sites I've run. But I'm getting the impression that WA is running its own proprietary OS or something. I can't imagine file security not being built into any major server OS.
Anyway, maybe there's a workaround for now ... is there a way we can encrypt our PDF files before or after uploading, and have your system decrypt them based on a simple password? That might not keep out the Kremlin, but would keep spiders from publishing anything useful.
I wouldn't expect our people to start encrypting files before uploading them, expecting Members to download and decrypt them, but some browsers (or your system?) insist that PDF files be downloaded before viewing anyway (bummer!), so this might also be an option. Pretty clunky, but better than distributing our financial data and such publicly. Suggestions as to third-party software for this?
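As one hypothetical sketch of that clunky pre-upload option (not a recommendation, and the file names and passwords below are placeholders): the open-source qpdf command-line tool can add a password to a PDF locally before it ever reaches the server.

```shell
# Encrypt report.pdf with 256-bit AES; readers need "memberpass" to open it.
qpdf --encrypt memberpass ownerpass 256 -- report.pdf report-protected.pdf
```

A spider could still download the encrypted file, but without the password its contents are useless, which addresses the "keep spiders from publishing anything useful" goal above.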
Note however, we also store some PDF files that we WANT people to see. For example the details of attending a certain event or excursion.