6 Deploy Analytics for LS Central
Now it is time to run the Deployment script you downloaded with the product package earlier, but first you need to make sure that you have VS Code and PowerShell installed on the Scheduler server. Follow the steps below to deploy Analytics.
Configure PowerShell
To run the script, the machine must have PowerShell installed. We recommend setting up VS Code with the PowerShell extension and then configuring PowerShell as described below.
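If you already have VS Code installed, the PowerShell extension can also be installed from the command line (assuming the code command is on your PATH):
code --install-extension ms-vscode.powershell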
You must have the new Azure PowerShell Az module installed. For further instructions see Introducing the new Azure PowerShell Az module.
Note: You must open PowerShell as administrator to be able to set up the module.
If you do not have the Az module installed, run the following command as Administrator in PowerShell:
Install-Module -Name Az -AllowClobber
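If you are not sure whether the Az module is already installed, you can check first. This lists the installed Az module versions and returns nothing if the module is missing:
Get-Module -Name Az -ListAvailable | Select-Object Name, Version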
You must also register the Az resource provider by running these lines in PowerShell:
Connect-AzAccount
If you have issues with connecting to the correct Azure account, you can add the subscription and tenant/parent management group to the connect command:
Connect-AzAccount -Tenant 'Tenant/ParentManagementGroupIDfromAzure' -Subscription 'SubscriptionIDfromAzure'
This opens an Azure login window in the background where you must log in before you continue with the next line:
Register-AzResourceProvider -ProviderName Microsoft.DataFactory
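You can verify the registration afterwards; RegistrationState should show Registered (registration can take a few minutes):
Get-AzResourceProvider -ProviderNamespace Microsoft.DataFactory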
To install Bicep, copy and run this script from Microsoft:
# Create the install folder
$installPath = "$env:USERPROFILE\.bicep"
$installDir = New-Item -ItemType Directory -Path $installPath -Force
$installDir.Attributes += 'Hidden'
# Fetch the latest Bicep CLI binary
(New-Object Net.WebClient).DownloadFile("https://github.com/Azure/bicep/releases/latest/download/bicep-win-x64.exe", "$installPath\bicep.exe")
# Add bicep to your PATH
$currentPath = (Get-Item -path "HKCU:\Environment" ).GetValue('Path', '', 'DoNotExpandEnvironmentNames')
if (-not $currentPath.Contains("%USERPROFILE%\.bicep")) { setx PATH ($currentPath + ";%USERPROFILE%\.bicep") }
if (-not $env:path.Contains($installPath)) { $env:path += ";$installPath" }
# Verify you can now access the 'bicep' command.
bicep --help
# Done!
Note: If you have already downloaded the product package, you can also run the InstallBicep.ps1 from there.
Run the Deployment script
- Navigate to your Base Folder, that is, the folder where you downloaded the Analytics product package earlier.
- In the Base folder, right-click an empty space in the folder and select Open with VS Code.
- When you open the folder this way, you have access to all the files in the folder from VS Code.
- If you have collected your parameters into a file, you can add that file to the Base folder and view it from VS Code as well.
- If you do not want to use VS Code, you can open the DeploymentScript.ps1 using PowerShell ISE, or your favorite PowerShell editor.
- Run the script (F5).
- The script will check for the Az module setup mentioned earlier. If the module is missing, the script will terminate, and you can set the module up at this point by following the steps in step 3 of the wizard.
- The script will check for the SqlServer module and install and import it if needed (if you prefer to set it up beforehand, see the example after this list).
- If the script detects old Analytics modules, it will remove them and import the modules from the current Base folder.
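If you want to set up the SqlServer module before running the deployment script, a manual setup along these lines should work (run PowerShell as administrator):
# Install and import the SqlServer module ahead of time
Install-Module -Name SqlServer -AllowClobber
Import-Module -Name SqlServer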
General information
First, the script will collect some general information about the Analytics setup.
- First, the script prompts for the Azure subscription ID:
- Enter the ID you saved earlier and press Enter (if you no longer have the ID, see the example at the end of this section).
- Next, you will be prompted to enter your Azure account login information.
The Azure login window pops up in a separate window but should get focus. If you have already logged in to Azure in a browser, your account should be visible and you can just press Enter. If not, you need to enter your Azure login credentials.
- Now the script asks you to select the resource group you want to use from a list of resource groups collected from Azure:
- Enter the number for the resource group you choose, and press Enter.
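If you no longer have the subscription ID at hand, you can, for example, list the subscriptions your account has access to after logging in:
# Lists subscription names and IDs for the logged-in account
Get-AzSubscription | Select-Object Name, Id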
LS Central information
Next, the script will prompt for LS Central source information. For this setup, this is only the company name, since all connections to the LS Central database are handled by the Scheduler server.
- The script will prompt for which type of LS Central you have.
- Here you should always enter 1 for Cloud (SaaS).
- If you do not have LS Central in the cloud, go back to the onboarding overview page and select the LS Central on-premises option that best fits your needs.
- The script prompts for the name of the company you want to use in the Analytics setup.
- An empty Companies.txt file is opened.
- Enter the company names exactly as they are displayed in the Name field of the Companies table in LS Central. If you want to include more than one company in your Analytics setup, enter each name on a new line (see the example after this list).
- Save the file.
- Press Enter in the script, and the script will prompt you to verify the companies you entered into the file.
- Enter y if they are correct, or n if you want to edit the file.
- If you select n, edit the file, save it, and then press Enter in the script.
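As an illustration, a Companies.txt file for two companies could look like this (the company names here are hypothetical; use the exact names from your LS Central):
CRONUS - LS Central
My Company Ltd.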
Analytics information
The script now prompts for information relating directly to the setup of Analytics in Azure:
- Next, you will be asked whether you want to set up the Analytics database in Azure or on-premises.
- Here the only option is Cloud, so you just press Enter.
- If you have LS Central in the cloud but would like to set up the Analytics database on-premises, you can contact us by registering a support ticket through the LS Retail Portal (login required).
- Next, the script asks whether you want to create a new SQL server in Azure or use an existing one. The number of existing SQL servers in the resource group you selected is displayed in parentheses after this option. We recommend setting up a new server, if possible.
- Enter 0 if you want to create a new server.
- Enter 1 if you want to use an existing server.
- If you choose to create a new server, you will next be prompted for a name for the new server. Otherwise, go to step 4 below.
- Enter a name for the new server, using lowercase letters, numbers, and hyphens only, and press Enter.
Then you will be asked to create a new server admin login:
- Enter a user name for the new admin user, and press Enter.
- If the user name does not meet the rules provided in the script, you need to choose a new user name.
- Enter a password for the new admin user, and press Enter.
- If the password you enter does not meet the rules provided in the script, you need to enter the user name again and select a new password.
- Confirm the password by entering it again, and press Enter.
- Make sure you save this password somewhere, or choose something you will definitely remember. The password is not saved anywhere, and you will need to provide it when connecting the Power BI reports to Analytics later in the process.
You can now move directly to step 5 below, since step 4 only applies if you selected to use an existing Azure SQL server.
- If you choose to use an existing server, the script displays a list of existing server names for you to choose from (see the example after these steps if you want to check the servers beforehand).
- Enter the number of the server you want to use, and press Enter.
You will then be asked to provide the admin login credentials for the server you selected:
- Enter the user name for the admin user of the server you selected, and press Enter.
- Enter the password for the admin user of the server you selected, and press Enter.
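If you want to check beforehand which SQL servers already exist in the resource group, you can, for example, run this (the resource group name is a placeholder):
# Lists the SQL servers in the given resource group
Get-AzSqlServer -ResourceGroupName 'MyResourceGroup' | Select-Object ServerName, Location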
- Now the script prompts for a database name for the new Analytics database:
- Enter a name for the database, and press Enter.
- Then select which Azure pricing tier you want to use for the database:
- Enter the number for the pricing tier you select. We recommend selecting tier S2 (the tier can also be changed later, as shown in the example after this list).
- You can read more about the pricing tiers on the Azure cost calculation page.
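The tier is not a final choice; a database can, for example, be scaled to another tier later with a command like this (the resource group, server, and database names are placeholders):
# Changes the database to the S2 service objective
Set-AzSqlDatabase -ResourceGroupName 'MyResourceGroup' -ServerName 'my-analytics-server' -DatabaseName 'MyAnalyticsDB' -RequestedServiceObjectiveName 'S2'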
- The script now prompts for a name for the Azure data factory that will be created in Azure and will contain all the pipelines needed to move data between the databases and tables.
- Enter the name for the Azure Data Factory (ADF), and press Enter.
- Note: The ADF name must be globally unique in Azure, so if you want to call it LSInsight, we recommend adding some additional letters or numbers to the ADF name.
- Next, the script offers to use the same location for your Azure resources as is set on the resource group you selected before, and displays that location.
- This is most likely what you want to do; enter y and press Enter.
- If you for some reason want to select a different location, enter n at this point and press Enter.
- The script will then look up all allowed locations in your subscription, which takes a few minutes, and then asks you to select the location you want.
- Enter the number of the location you select, and press Enter.
- The script will now display a summary of the Analytics parameters you have selected and entered. If everything is correct, enter y and the script will continue. If you do nothing or enter any other letter, the script starts collecting the Analytics setup parameters again.
Install Analytics
- The script will now create the resources in Azure and create the pre-staging tables in the Analytics database.
Tip: This usually takes about 5-10 minutes. Sometimes the import of the SQL database takes longer because of load on the Azure service. In that case, the creation of the Azure Data Factory continues, but you cannot continue the onboarding process until the database resource has been created. How to check this is explained later in the onboarding process.
The script will print out the following lines as the resources are created:
Installing Analytics for LS Central...
Creating a new Azure SQL server. This might take a few minutes... (this line does not appear if you selected an existing server)
Adding firewall rules...
Creating a new Azure SQL database. This might take a while...
Creating a new Azure Data Factory v2. This might take a while...
Adding companies into the Analytics database...
Creating the pre-staging tables in the Analytics database...
If at any point there is a validation error or something goes wrong, the script terminates and prints a red error message. This message most often explains the issue, and the error is also written to the error log in the Base folder. Some errors are clear, but others can be more cryptic; in that case it is good to check the troubleshooting section of the help.
When you run the script again after an error occurs, the script tries to reuse the parameters you selected before, but asks for verification, so be careful to answer n if you want to change something.
Once the script is done, it will:
- Print out the parameters needed to connect the Power BI reports to the Analytics database, except for the Analytics admin user password.
- This saves you looking up the server path in Azure. If you want to close VS Code, it is a good idea to copy this information and save it somewhere for safekeeping.
- Notify you by printing out a message (Done!).
- Add a new folder (YYYYMMDDHHMM-MyProject) in the Base Folder.
- Save the deployment information to the Parameters.json file in the MyProject folder.
Note: In this setup of Analytics, the source and Analytics databases are one and the same, since the pre-staging tables serve as the source for the Azure pipelines.
Verify creation of resources in Azure
- Open and log in to the Azure portal.
- You will find three resources created with the deployment script:
- Azure Data Factory
- SQL Database
- SQL Server
If the SQL database has not been created because of a heavy load on the database import service, you can check again the next day or monitor the import progress through the Azure SQL server that was created.
- In Azure Portal, open the SQL server from resources.
- From the left menu, select Settings - Import/Export history.
- Click the line for the database and it will show the status and start time.
- The import should finish within a few hours. If it fails, you can just delete all the resources created and run the deployment script again.
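You can also check the database state from PowerShell, for example (the resource group and server names are placeholders); Status shows Online when the import has completed:
Get-AzSqlDatabase -ResourceGroupName 'MyResourceGroup' -ServerName 'my-analytics-server' | Select-Object DatabaseName, Status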
Once the database import has completed, you need to manually add the companies before triggering the Initial load in the Azure Data Factory. An SQL script file named AddCompanies.sql has been created in the MyProject folder in the Base folder; it can be run against the Analytics database to add the companies. The script executes the stored procedure [dbo].[LSInsight$InsertCompany] with the name of each company.
It is also possible to trigger the pipeline 1 - Analytics query setup - Add or Delete Companies and add the companies in the Companies parameter.
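If you prefer running the stored procedure directly instead of the script file, a call along these lines should work, assuming the procedure takes the company name as its argument (the server, database, credentials, and company name are placeholders; Invoke-Sqlcmd comes with the SqlServer module):
# Single quotes keep PowerShell from expanding $InsertCompany as a variable
Invoke-Sqlcmd -ServerInstance 'my-analytics-server.database.windows.net' -Database 'MyAnalyticsDB' -Username 'adminuser' -Password 'YourPassword' -Query 'EXEC [dbo].[LSInsight$InsertCompany] N''CRONUS - LS Central'''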