By default, uploaded files are stored on the local disk of the machine where the tool is running. However, the tool also gives you the option to store uploaded files in S3 storage. Some customers may prefer not to keep uploaded files on the local disk because of space limitations and availability concerns.
If you have the tool installed across multiple server instances behind a load balancer, then storing the uploaded files in S3 is mandatory. The tool provides an option to share generated reports between users. Suppose one user uploads a file to server instance #1, and a second user then tries to access the generated report. The load balancer could route the second user's request to server instance #2; in that case, the second user will get an HTTP 404 error if the files are stored on the local disk.
Here are the two simple steps to store the uploaded files in S3 storage:
1. Create a file named ‘storage.xml’ and place it in the root folder where the application is installed.
2. In ‘storage.xml’, insert the S3 bucket name, access key, and secret key in the following format:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>Storage Configurations</comment>
    <!-- S3 bucket name where the uploaded files should be stored -->
    <entry key="s3.bucketName">my-bucket</entry>
    <!-- Your AWS access key -->
    <entry key="s3.accessKey">my-access-key</entry>
    <!-- Your AWS secret key -->
    <entry key="s3.secreteKey">my-secrete-key</entry>
</properties>
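If you want to confirm that your ‘storage.xml’ is well-formed before restarting the tool, you can parse it yourself: the DOCTYPE above is the standard Java properties DTD, so `java.util.Properties.loadFromXML` reads this exact format. The class and method names below are illustrative only, not part of the tool:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class StorageConfigDemo {

    // Parse XML in the java.util.Properties format used by storage.xml.
    // Throws if the XML is malformed, which is a quick sanity check.
    static Properties loadConfig(String xml) throws IOException {
        Properties props = new Properties();
        props.loadFromXML(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Same shape as the storage.xml example above, inlined for the demo.
        String xml =
            "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
          + "<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">\n"
          + "<properties>\n"
          + "  <comment>Storage Configurations</comment>\n"
          + "  <entry key=\"s3.bucketName\">my-bucket</entry>\n"
          + "  <entry key=\"s3.accessKey\">my-access-key</entry>\n"
          + "  <entry key=\"s3.secreteKey\">my-secrete-key</entry>\n"
          + "</properties>\n";

        Properties cfg = loadConfig(xml);
        System.out.println("bucket = " + cfg.getProperty("s3.bucketName"));
    }
}
```

In practice you would read the file from the installation root instead of an inline string; any parse error at this step means the tool would also fail to load the file.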
Here are the steps to enable SAML authentication:
1. In the root folder, you will find the ‘saml.xml’ configuration file. In this file, set the ‘auth.saml’ property to ‘true’.
2. In the same saml.xml, configure the Identity Provider properties.
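The exact Identity Provider property names are not listed above, so the fragment below is only a hypothetical sketch of what a completed saml.xml might look like, assuming it follows the same XML properties format as storage.xml. Only the ‘auth.saml’ key is confirmed by the steps above; every ‘idp.*’ key and value here is an assumed placeholder, and you should check the comments in your own saml.xml for the real names.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>SAML Configurations</comment>
    <!-- Confirmed by the steps above: enables SAML authentication -->
    <entry key="auth.saml">true</entry>
    <!-- Hypothetical placeholders; your file may use different key names -->
    <entry key="idp.entityId">https://idp.example.com/metadata</entry>
    <entry key="idp.ssoUrl">https://idp.example.com/sso</entry>
</properties>
```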