Trusted Contributor

Improvement to Webdrawer Security?

Hi,

We are looking for ways to improve the security of our WebDrawer implementation. We found that load on WebDrawer can directly affect the performance of the main Workgroup Servers, which in turn affects the performance of the production RM environment.

What are some ways we can improve its security? Should we apply SSL to WebDrawer?

We are using HPRM 8.3.0.9429.

Thank you,

Tom

Micro Focus Expert

I am not sure I see the link between security and performance.  Do you want to reduce the number of users and so improve performance?

One way to reduce the impact is to run a dedicated WorkGroup server for WebDrawer.


Blog | Samples | CM SDK Docs
**Any opinions expressed in this forum are my own personal opinion and should not be interpreted as an official statement on behalf of MicroFocus**
Trusted Contributor

David,

We have a dedicated Workgroup Server for WebDrawer, but the impact is still felt across the environment. Someone was crawling WebDrawer and downloading all available electronic attachments. We are looking for options not to stop them, but simply to slow them down.

We are also looking for general security measures to improve WebDrawer overall, not simply to reduce the number of users. Does that make sense?

Outstanding Contributor

Hi,

Your scenario is common across organisations and applies to any reporting workload. Since WebDrawer gives users read-only access, one way to gain performance without impacting the primary production users is to stand up a secondary server hosting a copy of the dataset with its own Workgroup Server (think of it as a real-time secondary environment with a single Workgroup Server). Point WebDrawer at that environment and let users run their searches there. Records can be synced from production in real time, as is usual for data replication.

A second option could be to replace WebDrawer with a portal that hosts the records currently exposed through WebDrawer. (Do you publish a selected set of records through WebDrawer, or is it an open search over everything?)


Cheers,
Harry
Micro Focus Expert

A couple of things that can be done today:

  • use the hptrim.config file to filter WebDrawer searches so that your WebDrawer instance only supports searching over a subset of your total dataset
  • use the WGS document cache to pre-fetch some (or all) of the documents in your document store, so they are not pulled live from the store when a user requests them
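As a sketch of the first bullet, the fragment below shows the idea of pinning WebDrawer's record searches to a fixed subset. The element and attribute names here are illustrative assumptions, not the actual hptrim.config schema; check the hptrim.config shipped with your WebDrawer install and the CM SDK docs for the real settings.

```xml
<!-- Illustrative sketch only: element and attribute names are assumptions,
     not the real hptrim.config schema. The idea is to AND every record
     search with a fixed clause so only a published subset is reachable. -->
<hptrim>
  <routeDefaults>
    <!-- hypothetical attribute pinning all searches to a saved search -->
    <add name="Record" searchFilter="saved:WebDrawerPublic" />
  </routeDefaults>
</hptrim>
```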

Something else we could do as an enhancement (or even potentially a customisation on an existing install) is to do something to obfuscate the Record Uri so that it is more difficult to trawl the entire dataset simply by incrementing the Uri.  We did something like this in a previous module.
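A minimal sketch of that obfuscation idea, using Python purely for illustration (WebDrawer is not Python, and the secret name is a placeholder): attach an HMAC tag to the numeric Record Uri, so that incrementing the number in a URL no longer yields a valid link to the next record.

```python
import hmac
import hashlib

# Hypothetical server-side secret; in practice load it from protected config.
SECRET = b"change-me"

def token_for_uri(record_uri: int) -> str:
    """Derive an opaque, non-guessable token for a numeric Record Uri."""
    tag = hmac.new(SECRET, str(record_uri).encode(), hashlib.sha256).hexdigest()[:16]
    return f"{record_uri}-{tag}"

def uri_from_token(token: str):
    """Return the Record Uri if the token's tag verifies, else None."""
    uri_part, _, tag = token.partition("-")
    expected = hmac.new(SECRET, uri_part.encode(), hashlib.sha256).hexdigest()[:16]
    if hmac.compare_digest(tag, expected):
        return int(uri_part)
    return None
```

A tampered or incremented token fails verification, so trawling by counting upward stops working even though the records are still publicly readable through legitimate links.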

Maybe, if the problem is electronic attachments, we could implement something to force authentication before download; for example, you could require people to authenticate with their Gmail or Facebook credentials. With WebDrawer that might require some work, though, as its behaviour is usually all-or-nothing with regard to authentication.

 

If you have a specific request in the security space I am happy to hear what it is.


Outstanding Contributor

You should have a WAF (web application firewall) sitting in front of WebDrawer; this will help weed out automated crawlers. You could also apply rate limiting.
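Rate limiting would normally live in the WAF or reverse proxy rather than in application code, but the underlying token-bucket idea is simple enough to sketch. The Python below is illustrative only, not a WebDrawer feature: each client earns `rate` requests per second and may burst up to `burst`.

```python
import time
from collections import defaultdict

class RateLimiter:
    """Per-client token bucket: each client may make `rate` requests
    per second, with short bursts of up to `burst` requests."""

    def __init__(self, rate: float, burst: float):
        self.rate = rate
        self.burst = burst
        # Each bucket holds [current tokens, timestamp of last update].
        self.buckets = defaultdict(lambda: [burst, time.monotonic()])

    def allow(self, client_id: str) -> bool:
        bucket = self.buckets[client_id]
        now = time.monotonic()
        tokens, last = bucket
        # Refill tokens for the time elapsed since the last request.
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        bucket[1] = now
        if tokens >= 1.0:
            bucket[0] = tokens - 1.0
            return True
        bucket[0] = tokens
        return False
```

Keyed on source IP (or session), this slows a crawler down without blocking legitimate users, which matches the "slow them down, don't stop them" goal above.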

In the end, protecting services exposed on the internet isn't a solved problem...

Trusted Contributor

You could also look at putting a CAPTCHA or something similar in front of the application. I know you mentioned not wanting to stop crawlers outright, but it might prove useful in the long run.

Another solution could be to set a crawl delay. I stumbled across this page, which outlines methods for achieving it: https://www.inmotionhosting.com/support/website/restricting-bots/how-to-stop-search-engines-from-crawling-your-website
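For well-behaved crawlers, a crawl delay is usually requested via robots.txt. Note that `Crawl-delay` is non-standard (Bing and Yandex honour it, Google does not), and a hostile scraper will ignore robots.txt entirely, so treat this as a courtesy signal rather than a control. The `Disallow` path below is illustrative; match it to your actual WebDrawer routes.

```text
# robots.txt served from the WebDrawer site root
User-agent: *
Crawl-delay: 10     # ask crawlers to wait 10 seconds between requests
Disallow: /Record/  # illustrative path - adjust to your download routes
```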

Rob

** My views are mine and are in no way attributed to those of my employer **