Cloud can be tricky sometimes. Find out what scenarios we've run into that are worth mentioning and explaining.
Within any environment, application and infrastructure logging plays an important role. As much as we would like our solution to be perfect, issues will always arise within a production environment. When they do, a good logging strategy is crucial.
And today I want to talk about some special cases that arise in Azure. It's about applications like WordPress or Magento (others could be listed, but you get the point) - applications that rely on databases to function and store a lot of logging information directly in those databases (usually MySQL databases).
You might be thinking "But I already have logging in Azure and I can set it up directly through the portal - why the need for a custom setup?". And you are right, partially - some logs (audit logs) are available directly:
But what about the information that is stored in the database? Remember - the apps I mentioned rely on databases to save data - so, for example, information about how many incorrect logins a user had in the app can only be found somewhere in a specific table of the database that is serving the application. There are many more examples - but to sum it up, most info at application level is logged/saved in the database. Azure audit logs only offer limited information - usually about connectivity to the DB and the queries that have been run on it.
"Why would I need such logs?", you might ask - Well, for monitoring purposes, of course. Maybe you need to be alerted when a configuration change is done in the app or if another account was created at application level.
If you want to do this so that Azure will alert you automatically, you'll have to architect something a bit special:
So, it goes like this:
1. First you will have to create triggers inside the application database and choose what information (from which table, which column, etc.) you want to log. The information you need will be "saved" in a custom table inside the DB with the help of these triggers.
Here in these triggers you can also do some data manipulations, if needed - it's all up to you.
2. Create an Azure Data Factory pipeline that moves the data from that custom table to a blob in Azure Storage. You can set up the ADF pipeline to run at a given interval (e.g. every 5 minutes) or to have a custom trigger, based on your preferences.
Make sure that this data is JSON formatted and UTF-8 encoded! Azure Log Analytics supports only ASCII and UTF-8 encoding.
Here, in the pipeline, you can also do some data manipulations, if needed - it's all up to you.
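To make the encoding requirement concrete, here is a minimal Python sketch of what the payload written to the blob should look like. The table and column names are hypothetical - yours will depend on what your triggers capture:

```python
import json

def rows_to_blob_content(rows):
    """Serialize audit rows (a list of dicts) into the UTF-8 JSON
    payload that the ADF pipeline should write to the blob."""
    # ensure_ascii=False keeps non-ASCII characters as real UTF-8 text
    # instead of \uXXXX escapes; encode() guarantees UTF-8 bytes.
    return json.dumps(rows, ensure_ascii=False, default=str).encode("utf-8")

# Hypothetical rows captured by the database triggers:
rows = [
    {"event_time": "2023-05-04T10:15:00Z", "table_name": "wp_users",
     "action": "user_created", "detail": "new admin account created"},
]
blob_content = rows_to_blob_content(rows)
```

The important part is that the blob ends up as a JSON array of objects in UTF-8 - that is the shape the later steps can push into Log Analytics without issues.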
3. Create an Azure Logic App that will execute when a blob is added or modified:
Why this? Because each time Azure Data Factory (ADF) runs the pipeline to move data from the database table to Azure Storage, it modifies (or adds, depending on how you set things up) the blob.
Configure this Logic App to fetch the blob content (essentially the logs that you "exported" from the database in JSON format) and to "inject" it into Log Analytics using the "Send Data" action of the Azure Log Analytics Data Collector connector:
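Under the hood, the "Send Data" action calls the Azure Monitor HTTP Data Collector API. The connector handles this for you, but a Python sketch of the request it builds helps demystify the step (the workspace ID and key below are dummy placeholders - the real values come from your Log Analytics workspace):

```python
import base64
import datetime
import hashlib
import hmac

# Dummy placeholders - use your real workspace ID and primary key.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
SHARED_KEY = base64.b64encode(b"dummy-key").decode()  # base64 string

def build_signature(date_rfc1123, content_length):
    """Build the SharedKey authorization header the HTTP Data
    Collector API expects (this is what "Send Data" does for you)."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

body = b'[{"action": "user_created", "detail": "new admin account"}]'
rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
headers = {
    "Content-Type": "application/json",
    "Authorization": build_signature(rfc1123_date, len(body)),
    "Log-Type": "AppAudit",  # name of the custom log (becomes AppAudit_CL)
    "x-ms-date": rfc1123_date,
}
url = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
       f"/api/logs?api-version=2016-04-01")
# An HTTP client would now POST `body` to `url` with `headers`;
# the actual call is omitted so the sketch stays self-contained.
```

The "Log-Type" value you pick here determines the name of the custom log you will query in the next step.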
4. Every time the Logic App is triggered and executed, it will inject the information into a custom log inside Azure Log Analytics:
5. All you need to do now is create KQL queries and set up Azure alerts based on the logs generated by this custom configuration.
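As a sketch, a KQL query over the custom log might look like the one below. The log and field names are hypothetical - custom logs ingested this way get a "_CL" suffix on the table name and "_s" suffixes on string fields:

```kusto
// Alert when new application accounts are created
AppAudit_CL
| where action_s == "user_created"
| summarize NewAccounts = count() by bin(TimeGenerated, 5m)
```

Attach a query like this to an Azure Monitor alert rule (e.g. fire when NewAccounts > 0) and you have an automated notification for application-level events.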
Now you can worry less about application-level security, because your monitoring runs automatically.