Adding Security (HTTP Response) Headers to IBM (Lotus) Domino Server to get an “A” Rating

IBM is doing its best to keep its Domino HTTP server updated with the security patches and enhancements that are more readily available in competing web servers such as Apache and Nginx. One of these security enhancements is support for adding security headers (HTTP response headers) to Domino's HTTP server.

The main headers needed currently are:

Starting with Domino 9.0.1 Feature Pack 6 (FP6), you can add the majority of the above headers directly in Domino using a combination of Notes.ini settings and an Internet Site document. IBM's stated limitation is that Domino can accommodate a maximum of 4 headers, since you can add only one header via Notes.ini and 3 via the Internet Site document. However, we've found that the limit is actually 5, since the Strict-Transport-Security header can be added with a separate Notes.ini setting.

For the Internet Site document you'll have to pick and choose which headers to add, since you have that 3-header limit. Based on our testing and experience, the header that causes the most problems after it is applied is Content-Security-Policy. For sites with complex applications running XPages and other scripting technologies, this header can break certain functions, so we decided to leave it out of the 5 that we added. Your experience may differ, and you may decide to add it in place of another header.

You can add the following entries to the Notes.ini to support Strict-Transport-Security and X-Frame-Options (after saving your Notes.ini you will need to restart your Domino server for these to take effect):
HTTPAdditionalRespHeader=X-Frame-Options: SAMEORIGIN

You can add the remaining 3 headers by creating a Web Site HTTP Response Header Rule under the Internet Site document:

If you don't already have an Internet Site document for your web server, you will need to create one and change the settings on the Basics tab of the server document to point to the newly created Internet Site document (instructions on how to create an Internet Site document are beyond the scope of this post). Once you have a new or existing Internet Site document, the next step is to create an HTTP Response Header Rule ("HTTP response headers" is a type of Web Site Rule document), which lives under that same document. When creating the HTTP Response Header Rule, set the following values:
Description: Whatever description you want here
Type of Rule: HTTP response headers
Incoming URL Pattern: /*
HTTP response codes: 200, 206
Expires header:
  Always add header (override application's header)
  Specify as number of days (selected)
Custom Headers (see table below and attached image):
  Name: X-XSS-Protection | Value: 1; mode=block | Override: Checked

Once you have filled in the Web Site Rule correctly, save and close it. 

After it is saved, restart the Domino HTTP task to activate the new rules (you don't need to restart the whole Domino server). Once the HTTP task is restarted, you can go to the following website to scan and check the status of your domain: all 5 of the headers we added (we excluded Content-Security-Policy) should now be recognized and you should have an "A" rating assigned (please see attached image for similar results):

Here are all five of the settings and their corresponding values:
Strict-Transport-Security: max-age=31536000; includeSubDomains
Referrer-Policy: same-origin
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
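As a quick sanity check, you can also inspect the headers your server actually returns from the command line. This is a minimal sketch: the `check_headers` helper is our own convenience function (not part of Domino or curl), and the hostname is a placeholder you would replace with your real site.

```shell
# Print only the security-related response headers from a server response read on stdin.
check_headers() {
  grep -iE '^(strict-transport-security|x-frame-options|x-xss-protection|x-content-type-options|referrer-policy|content-security-policy):'
}

# Usage against a live server (hypothetical hostname):
#   curl -sI https://www.example.com/ | check_headers
```

If the rules took effect, each header you configured should appear in the output; anything missing points back at the Notes.ini setting or Web Site Rule for that header.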

Another way to add security headers is to put a reverse proxy in front of your Domino server. The reverse proxy can reside on the same machine as the Domino server, or separately in the DMZ to provide additional security. Setting up a reverse proxy to add the HTTP response headers is a topic for a future post.

DIIOP over SSL and SHA-2 on IBM Domino (XWork) Server 9 don’t mesh

IBM has been working hard recently to shore up the SSL vulnerabilities in Domino and XWork Server, applying fixes to both Domino 8.5 and 9.0 and requiring multiple interim fixes over the past few months. Most recently, IBM has also made support for SHA-2 SSL certificates available, so users with Domino websites can finally upgrade their SHA-1 SSL certificates to SHA-2 for better security. However, the new security updates have also had some unexpected side effects on other IBM Domino tasks running on the server. In particular, we came across a problem accessing the Domino server via DIIOP over SSL.

Normally it is fairly easy to use a Java application to access Domino via DIIOP with or without SSL. With SSL it is necessary to use the TrustedCerts.class file generated by the DIIOP task on Domino, located in the domino/java folder under the Domino data folder. However, most recently, accessing DIIOP over SSL was not working on our Domino 9.0.1 Fixpack 3 Interim Fix 2 server using a SHA-2 SSL certificate. Even with the proper TrustedCerts.class file, access was failing.

Looking at the Domino server console we noticed the following error when the DIIOP task was started:
DIIOP Server: Agent error: keyrng: Could not read certificate

We also noticed the following errors at different intervals when trying to access DIIOP over SSL:
TLS/SSL connection X.X.X.X(63149)-X.X.X.X(64023) failed with rejected protocol version

Also from the Java application trying to connect to DIIOP over SSL we saw the following error:
NotesException: Session closed due to communications failure

To fix the "TLS/SSL connection X.X.X.X(63149)-X.X.X.X(64023) failed with rejected protocol version" error, enable SSLv3 on the server by setting "DISABLE_SSLv3=0" in the notes.ini.

The same Java code was working fine against a Domino 9.0.1 FP3 IF3 server with a SHA-1 certificate, so, after enabling SSLv3 we switched out the SHA-2 certificate with a SHA-1 SelfCert created by Domino and all was good in the DIIOP world again.

Note that enabling SSLv3 to make DIIOP over SSL work also opens up your HTTP server to known SSLv3 vulnerabilities, so it would be best to either disable HTTP altogether or put the server behind a reverse proxy such as Apache or IHS to protect the HTTP task while still allowing access to DIIOP.

We checked the above with IBM and, as of this writing, IBM acknowledges that DIIOP over SSL with a SHA-2 certificate with SSLv3 disabled is not currently supported. They are working on the issue, but this may not be resolved in the near future.

Other Errors you might have seen on the Domino Server Console when investigating:
TLS/SSL connection X.X.X.X(443)-X.X.X.X(11951) failed with server certificate chain signature algorithms NOT supported by client
TLS/SSL connection X.X.X.X(443)-X.X.X.X(11951) failed with server certificate chain requiring support for SHA384
HTTP Server: SSL handshake failure, IP address [X.X.X.X], Keyring [cert.kyr], [SSL Error: Invalid SSL message], code [4166]
HTTP Server: SSL handshake failure, IP address [X.X.X.X], Keyring [cert.kyr], [SSL Error: Invalid peer], code [4171]
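If you are troubleshooting the same problem, one quick check is to scan a saved copy of the server console log for the failure strings above. This is a small sketch of that idea; the `scan_console_log` helper is our own, and the log file path is an assumption (adjust it to wherever your server writes its console log, typically under the data directory's IBM_TECHNICAL_SUPPORT folder).

```shell
# Count how many lines in a saved console log match the DIIOP/TLS failure
# messages discussed above.
scan_console_log() {
  grep -cE 'rejected protocol version|Could not read certificate|SSL handshake failure' "$1"
}

# Usage (hypothetical path; adjust to your server's data directory):
#   scan_console_log /local/notesdata/IBM_TECHNICAL_SUPPORT/console.log
```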

The Trials and Tribulations of taking Quickbooks to the Cloud (Part I)

Recently our organization made a concerted effort to move all of our main server applications to the Cloud.  We wanted to significantly reduce the amount of time we spent internally managing and maintaining our servers so we could dedicate more of our time to our core business.  One of the servers we moved to the Cloud was our #Quickbooks accounting server.  Internally this was a virtual machine running Windows 2003, accessible via remote desktop client (RDP) from inside the firewall or from the outside via VPN.
We considered a couple of options before moving Quickbooks to the Cloud:

1. Have our Quickbooks hosted by a third party hosting company with automatic backups

2. Host Quickbooks ourselves using a Dedicated or Virtual Private Server from an Internet Hosting company

Hosting Option 1 was initially the more attractive choice, since we could offload the whole process to a third party by simply uploading our Quickbooks data file and other important related files after an account was created for us.  Furthermore, Option 1 meant that we would not have to deal with backups, since the hosting company automatically handled that for us on a daily basis.  Going with Hosting Option 2 meant that we would have to install Quickbooks ourselves, set up the various user accounts to access via RDP, and create our own remote backup process.  From a monthly cost perspective, Hosting Option 2 was a little less expensive, since Quickbooks hosting companies usually charge per user per month, whereas the VPS would be a fixed monthly cost for as many users as the Quickbooks application could support (in our case a total of 3).

Since our goal was to reduce our time managing servers and applications, we decided on Hosting Option 1 and signed up with a third party QB hosting company.  The sign-up process was fairly painless.  We filled out a form, gave the company our QB licensing information and users' names, and away we went.  By the following day we could upload our Quickbooks data and start accessing our accounting system.  To print we had to install a special universal print driver that worked with RDP, which was also fairly painless to set up.

For the first few days with the hosted QB solution everything was going quite well, until all of a sudden we were no longer able to access our Quickbooks.  We contacted customer support and within a couple of hours we were back up and running.  This happened more than 3 times over a 3-week period.  At one point they moved our account to a different hosted server to solve our access problem.  In addition, accessing Quickbooks would slow down dramatically during the day as, we assumed, other users were accessing their hosted solutions at the same time.  After our final customer support request was resolved, we could only start our Quickbooks session in Single-User Mode and then had to switch to Multi-User Mode manually after each logon; it would never stay in Multi-User Mode after logging off of the application.  Although this was a minor inconvenience, it still meant having to contact the person currently logged in to ask him or her to switch to Multi-User Mode when other users wanted to access QB.

The customer support of the hosting company was very responsive and they would always end up solving our problems.  However, one of the reasons for moving to the Cloud was to not have to deal with these kinds of problems anymore.  We wanted access to our accounting system without hiccups if at all possible.  Although there is never a perfect solution, when we had QB hosted internally we had very few issues related to accessing our system, so it was frustrating to suddenly experience so many problems within the first 30 days.

Within a week or so of going with the third party hosting company and experiencing access problems, we started to be concerned about how reliable the backups of our data were.  Also, what would happen if the company suddenly went belly up and we could no longer get access to our Quickbooks?  Even worse, what if we could no longer get access to our data and months of accounting transactions were literally gone?  I'm sure, as we all ponder the advantages and disadvantages of having our data and applications in the Cloud, reliability of the Cloud providers will always be a major concern.  We are now putting our trust in a third party to serve up, back up and maintain something as important as our accounting system.

To help alleviate our concerns of no longer controlling our data, we looked into ways to regularly backup our QB data ourselves as a fail safe in case the unthinkable were to happen.  Here are the options we considered:

A. Manually run a backup of Quickbooks from the Cloud to our local drive when prompted (Quickbooks can remind you to back up after a set number of logoffs)

B. Use the Intuit Data Protect (IDP) process to have our data automatically backed up directly to Intuit for a small monthly fee

C. Install a third party automatic backup solution 

D. Backup the data to a remote FTP server from the Hosted solution

Backup Option A was the simplest and cheapest of all.  There's no cost to having Quickbooks back up your data to a local drive from an RDP session.  After you use Quickbooks a few times, you can configure it to remind you to do a backup.  This was also our secondary backup scenario when we hosted the application internally.  Of course, when you back up from an internal server to a local drive on an internal PC, the backup is very quick.  However, from the remote hosting company to our local PC, the backup took almost 30 minutes to complete.  That meant that every time we wanted to back up QB after closing the application, we'd have to budget around 30 minutes to run the backup before we could disconnect from our session.  This made the backup so inconvenient that we'd probably never choose to run it, defeating the whole purpose of the backup process in the first place.

So we turned to Backup Option B, which appeared to be simpler than Option D and didn't require us to install third party software as in Backup Option C.  We activated IDP for the 30-day trial with Intuit and set up the backup of our Quickbooks file to run daily.  It was great to know that the backup would run even if the Quickbooks file was open at the time.  In practice, this turned out to be more complicated than anticipated.  Since our user accounts on the hosted solution were locked down so tightly, it took a little legwork and intervention from customer support to get IDP activated and working in the first place.  When we finally had it working, within a couple of days IDP could no longer contact the remote Intuit backup server.  We assumed that the hosting company had blocked access to the Internet port required by IDP to run properly, making it an unworkable solution for us.

We also soon realized that both Backup Options A and B only backed up our QB data file.  No other files were being backed up, including logos, attachments, and other important files that you may have added to Quickbooks for your invoices and other records.  This meant that a full restore, although it would still get us our most important accounting data, would leave us having to somehow replace all of the other files manually to get our setup working the way it did before the restore was necessary.

Since Backup Options A and B were no longer viable, we tried Backup Option C.  However, we immediately found this was not possible at all.  The hosting company had blocked the installation of third party software, including backup software solutions, so we had to cross Backup Option C off of the list.

We were left with Backup Option D.  We researched and built an automated backup solution using a third party zip archiving utility, a Windows FTP utility, and a Windows batch (.bat) file.  We were even able to automatically append the current date and time to the file name of each backup to perform a simple type of versioning.  Surprisingly, the hosting company did not block the installation of these third party utilities; I can only assume this was because they were simple utilities.
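To give a flavor of the date-stamped versioning idea, here is a minimal sketch.  Our actual solution was a Windows .bat file with a zip utility and an FTP client; this POSIX shell equivalent uses tar/gzip instead, and the function name and paths are our own illustrative assumptions.

```shell
# Archive a QuickBooks data folder into a date/time-stamped tarball, giving a
# simple kind of versioning: each run produces a new, uniquely named archive.
backup_qb() {
  src="$1"                          # QuickBooks data folder to archive
  dest="$2"                         # folder to write the archive into
  stamp=$(date +%Y%m%d-%H%M%S)      # current date and time, appended to the name
  archive="$dest/qb-backup-$stamp.tar.gz"
  tar -czf "$archive" -C "$src" . && echo "$archive"
}

# Usage (hypothetical paths): backup_qb /hosted/quickbooks /backups
# Uploading the resulting archive to a remote FTP server is a separate step.
```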

With the automated backup finally working, and after many problems accessing our third party hosted solution, we realized that maybe we should cut our losses and just go with Hosting Option 2: host Quickbooks ourselves using a Dedicated or Virtual Private Server from an Internet Hosting company.  So we investigated various Internet Hosting company solutions and finally found one that met our budget and hosting criteria, installed Quickbooks, uploaded all of our data, and set up our RDP accounts.  Within a few hours we were up and running.  We also installed the necessary software to get the backups working too.

Our main accounting person was delighted.  The VPS was many times faster to use, printing was very simple with the built-in TS printer ports, and the one-click backup after closing QB took less than a minute to back up the entire QB folder, with the current date and time appended to the filename for versioning.

As you or your company move to the Cloud, I hope that this recounting of our Quickbooks hosting experience will help you in choosing the right path for your needs.  With so many SMBs still using Quickbooks to handle their crucial day-to-day accounting tasks, having a secure, reliable and recoverable Quickbooks environment is an essential part of Cloud Computing.  I'd also like to reiterate how important it is for your company to maintain control of its accounting data if the unthinkable happens.  #CloudComputing is the future, but there's no reason why you have to give up full control of your applications and data.  At least in this case, you "can have your cake and eat it too".

Lastly, I'd like to point out why Backup Option D was also the best and most comprehensive backup to go with anyway.  Backup Options A and B only back up the Quickbooks data file.  They don't back up any other additional files or Quickbooks attachments that you may have outside of the QB data file.  If you restore only from the data file you will have your crucial accounting data, but nothing else.  Backing up the entire Quickbooks folder gives you true disaster recovery, with a way to get right back to your accounting after a restore.  Everything is restored exactly as it was at the last backup.

A word of caution about signing up for the Intuit Data Protect 30-day trial: be aware that once you sign up, you cannot cancel the trial online using your Intuit account.  You have to hunt down the right Customer Support number and then spend quite a bit of time with them to cancel your account over the phone.  We did all that before the trial was up and still got charged after the 30-day trial was over and after our account was canceled.  I had to contact them again to get the charges reversed on our credit card.  It was very time consuming indeed, so I would highly recommend thinking very carefully before signing up.  If they set up a way for you to cancel online, since you are able to sign up online, it would probably make the whole IDP experience a lot easier to deal with.

Please note: the full details of how we ultimately got the automated backup process to work will be covered in a separate post coming soon.  For this post, I wanted to focus on our challenges hosting #Quickbooks in the Cloud.  So watch for that upcoming blog post.