A slow Sitecore PaaS environment is rarely just about the Web Apps. The backend is just as important as the front end, and if it is not properly scaled, you will run into limitations.
Consider Elastic Pools
As a best practice, you should group your Azure SQL databases into Elastic Pools and size each pool with enough overhead for the group. This ensures that a smaller database has enough flex room. It also reduces the governance required for managing DTUs per database, since this can now be set at the pool level.
Check out Microsoft's guide on managing and scaling Azure SQL databases with Elastic Pools.
For sizing, you will want to check both the total average usage and the max usage of all the databases you want to add into a pool, and use the larger value. As an example, review the table below:
| Database Name | Average Usage (DTU) | Max Usage (DTU) |
|---------------|---------------------|-----------------|
The total average usage is 35 DTU, but the max usage is 55 DTU. Take the larger number and size the Elastic Pool holding these databases above 55 DTU. To be safe, leave enough overhead above the max; I prefer 30%, and I use Azure SQL Database alerts to monitor the databases.
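The sizing rule above can be sketched as a small helper: take the larger of total-average and max usage, add the overhead, and round up to the next available pool size. The Standard-tier eDTU sizes listed here are assumptions; confirm them against the current Azure SQL pricing page for your region.

```python
# Hypothetical sizing helper for a DTU-based Elastic Pool.
# The Standard-tier eDTU sizes below are assumptions; verify against
# the current Azure SQL pricing page before relying on them.
STANDARD_EDTU_TIERS = [50, 100, 200, 300, 400, 800, 1200, 1600, 2000, 2500, 3000]

def recommend_pool_edtu(total_avg_dtu, max_dtu, overhead=0.30):
    """Take the larger of total-average and max usage, add overhead,
    then round up to the next available eDTU pool size."""
    required = max(total_avg_dtu, max_dtu) * (1 + overhead)
    for tier in STANDARD_EDTU_TIERS:
        if tier >= required:
            return tier
    raise ValueError("Usage exceeds the largest assumed Standard tier")

# Using the figures from the table above: 35 DTU total average, 55 DTU max.
print(recommend_pool_edtu(35, 55))  # 55 * 1.3 = 71.5 -> next size up is 100
```

With the example numbers, a Standard 100 eDTU pool is the smallest size that leaves the 30% cushion over the observed 55 DTU peak.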
As far as how you group the databases, there are several options. Grouping by workload is a common approach, where the web and front-end databases go in one pool, operational databases (think Core, Master, EXM, Processing, Reporting) in another, xConnect and Cortex together, and Commerce databases in a final pool. The key is to test and determine what is governable for you.
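As one concrete illustration of the workload grouping described above, the mapping below sketches four pools. The pool and database names are hypothetical placeholders, not Sitecore's actual database names; adjust both the names and the groupings to whatever is governable for your environment.

```python
# Illustrative workload-based grouping of databases into Elastic Pools.
# All names here are placeholders, not actual Sitecore database names.
POOL_GROUPS = {
    "web-pool":         ["sitecore_web"],
    "operational-pool": ["sitecore_core", "sitecore_master", "sitecore_exm",
                         "sitecore_processing", "sitecore_reporting"],
    "xconnect-pool":    ["sitecore_xdb", "sitecore_xdb_processing"],
    "commerce-pool":    ["commerce_shared", "commerce_global"],
}

for pool, databases in POOL_GROUPS.items():
    print(f"{pool}: {len(databases)} database(s)")
```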
Monitor Your Elastic Pool
Monitoring your Elastic Pool is a critical activity. There are a lot of great tools within Azure to do so.
On the Elastic Pool itself, you can review the overall pool resource utilization. You can further drill down into the database resource utilization to view average and max DTU.
As Sitecore is an ever-changing environment, depending on user load and any new code, it's critical to keep a constant eye on the impact these changes have on your overall system health. View the “Monitor an elastic pool and its databases” article for more options.
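The kind of check an alert performs on these metrics can be sketched in a few lines: given recent DTU-percentage samples for each database, flag any database whose average or peak utilization crosses a threshold. The thresholds and sample data below are assumptions for illustration, not Azure defaults.

```python
# Minimal sketch of a DTU-utilization check, of the kind an Azure SQL
# Database alert would run. Thresholds here are assumed values.
def flag_hot_databases(samples_by_db, avg_threshold=70.0, max_threshold=90.0):
    """Return the databases whose average or peak DTU % breaches a threshold."""
    hot = []
    for name, samples in samples_by_db.items():
        avg = sum(samples) / len(samples)
        peak = max(samples)
        if avg >= avg_threshold or peak >= max_threshold:
            hot.append(name)
    return sorted(hot)

# Hypothetical DTU-percentage samples per database:
metrics = {
    "sitecore_master": [20, 35, 40, 30],   # well under both thresholds
    "sitecore_web":    [60, 75, 95, 80],   # peak of 95 trips the max threshold
}
print(flag_hot_databases(metrics))  # ['sitecore_web']
```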
It's Not Just DTUs
At the varying levels of eDTUs, there is also a max storage per pool and a maximum number of databases, depending on the Basic, Standard, and Premium tiers. As an example, a Standard 100 eDTU pool allows a max storage of 750 GB, but only includes 100 GB out of the gate; anything beyond that is billed in addition to the eDTU cost.
So if you only need 50 DTUs but have 200 GB of data, you would need to either go to a larger pool or pay for additional storage. In the unlikely case that you need more than 200 databases at the Standard 100 eDTU pool level, you would also need to move up to a higher eDTU pool.
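The storage tradeoff above reduces to simple arithmetic, sketched below. The figures for a Standard 100 eDTU pool (100 GB included, 750 GB max) come from the example in the text; treat them as assumptions and verify against current Azure pricing.

```python
# Hedged sketch of the storage-vs-pool-size decision described above.
def extra_storage_needed(required_gb, included_gb, max_gb):
    """Return the additional storage (GB) to purchase, or None if the
    requirement exceeds what this pool tier can hold at all."""
    if required_gb > max_gb:
        return None  # must move to a larger pool tier instead
    return max(0, required_gb - included_gb)

# Needing 200 GB in a Standard 100 eDTU pool (100 GB included, 750 GB max):
print(extra_storage_needed(200, 100, 750))  # 100 GB of additional storage
```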
Pricing and levels can be found using the Azure SQL Pricing Tool (select the DTU option).
IMPORTANT: When setting up Elastic Pools, make sure to set your Per Database Settings to less than the overall pool eDTU, within the Configure section of the Elastic Pool. This is critical because if you set a pool to 200 eDTU and also allow a single database to grow to 200 eDTU, your other databases will not have the power they require to operate. Measure your per-database metrics to determine an appropriate per-database max.
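The rule above can be expressed as a quick sanity check: the per-database max should sit far enough below the pool's eDTU that the remaining databases always have capacity. The minimum headroom fraction used here is an assumption for illustration, not an Azure rule.

```python
# Sanity check for the per-database max setting described above.
# The 25% minimum headroom is an assumed policy, not an Azure requirement.
def per_db_max_is_safe(pool_edtu, per_db_max, min_headroom=0.25):
    """True when the per-database cap leaves at least min_headroom of the
    pool's eDTU available to the other databases."""
    return per_db_max <= pool_edtu * (1 - min_headroom)

print(per_db_max_is_safe(200, 200))  # False: one database could consume the whole pool
print(per_db_max_is_safe(200, 100))  # True: at least half the pool stays available
```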
What About vCore?
DTU-based Elastic Pools are not the only option… there are also vCores. vCore pools align more with the traditional on-premises SQL model: instead of DTUs, you have cores and RAM. There is still a maximum number of databases per offering, but the model gives you more control.
Pricing and levels can be found using the Azure SQL Pricing Tool (select the vCore option). Note that in both options, the pricing will vary by region.
Any Other Cool Elastic Pool Features?
Beyond just grouping the databases together, there are several strong Elastic Pool features. You can run elastic jobs to help with managing a large number of databases. Point-in-time restore allows recovery, while geo-restore and active geo-replication offer additional levels of data security.
What Do I Do Next?
Before you move your databases into an Elastic Pool and set the max DTU value, you need to determine if your custom code is part of the reason the Sitecore databases have so much load.
It's not so much that you may have “bad code” (hint: check for bad code!), but knowing that even with the same number of users, once you add or change code, the backend will be affected.
It's akin to a bunch of people grabbing shopping carts at the grocery store… depending on how much they put in their carts, checkout takes longer for everyone.