Recently I was writing a PowerShell script to help a customer automate the process of deploying Azure SQL Databases. One of the challenges of automation that I remember since the dawn of time is how to secure the credentials in your automation scripts. Back in the old UNIX days, we used file permissions to protect password files and then read them in, but it was messy and probably not as secure as I would have liked.
One of the benefits cloud computing has offered is building out a lot of infrastructure and giving smaller organizations the opportunity to take advantage of capabilities that used to be available only to large enterprises. A good example of this is Azure SQL Database geo-replication—in the past, if you wanted to have a database in four regions throughout the world, you had to lease space in four data centers, build a global network, and possibly even get people in place in different parts of the planet to make sure those machines stayed running. Now, with a few mouse clicks you can have your database on four continents (and for as cheap as $20/month, or more realistically $1000/month)—this is where we see the real benefits of cloud computing. Another of these services is Azure Key Vault—in the past, Hardware Security Modules provided a root of trust amongst other security features. Now, in Azure, we can use Key Vault for password management, certificate management, and hardware trusts.
Key Vault is especially handy when trying to pass a password into a script. Since it’s fully implemented with PowerShell, the CLI, and a REST API, we can easily call it from a script. This script example is pretty basic, but it’s all I needed to securely pass a password into my automation job.
The first thing you will need to do is create a key vault, and then create a secret. Once your secret is created, you will be able to reference it in code.
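These steps can also be done from PowerShell. This is a minimal sketch using the AzureRM-era cmdlets that match the rest of this post; the resource group, vault, and secret names are placeholders, and it assumes you are already logged in with Login-AzureRmAccount.

```powershell
# Create the vault in an existing resource group (names here are placeholders)
New-AzureRmKeyVault -VaultName 'MyKeyVault' -ResourceGroupName 'MyRG' -Location 'East US'

# Prompt for the password so it never appears in plain text in the script,
# then store it in the vault as a secret
$secretValue = Read-Host -Prompt 'Enter the SQL admin password' -AsSecureString
Set-AzureKeyVaultSecret -VaultName 'MyKeyVault' -Name 'SqlAdminPassword' -SecretValue $secretValue
```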
I pasted this image for readability, but you can see my code example on GitHub here. It’s pretty basic: I’m defining a variable called password, getting its value from the Key Vault, and then passing it into the -SqlAdministratorCredentials parameter of New-AzureRmSqlServer.
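A minimal sketch of that pattern looks like the following; it assumes the vault and secret created earlier, and the server and resource group names are placeholders.

```powershell
# Pull the secret from Key Vault; SecretValue comes back as a SecureString,
# so the password never sits in the script as plain text
$secret = Get-AzureKeyVaultSecret -VaultName 'MyKeyVault' -Name 'SqlAdminPassword'

# Build a PSCredential from the admin login name and the SecureString
$credential = New-Object System.Management.Automation.PSCredential('sqladmin', $secret.SecretValue)

# Create the logical SQL server using the credential from the vault
New-AzureRmSqlServer -ResourceGroupName 'MyRG' `
    -ServerName 'my-sql-server' `
    -Location 'East US' `
    -SqlAdministratorCredentials $credential
```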
Azure is a great platform to use, and the new Managed Disks are fantastic as you just need to set it and forget it. However, if you run into the problem that I ran into recently when moving a VM from one resource group to another, you’ll hit a stumbling block pretty fast. I hit this because I put a VM in the wrong resource group; the customer then installed a bunch of software on the VM, and then I went to move it to another resource group. That’s when I got the lovely error that said “Operation ‘move’ is not supported on Resource ‘MyVM’ with managed disks. (Code: BadRequest)”.
This error was a bit of a problem as I didn’t want to blow away the VM. I had to turn on a couple of Azure features in PowerShell (eventually you won’t have to do this, maybe you don’t have to do this already) and I was able to move the VM without issue. After running the following commands (and using Get-AzureRmProviderFeature to wait for the feature to be registered), the move worked.
Register-AzureRmProviderFeature -FeatureName ManagedResourcesMove -ProviderNamespace Microsoft.Compute
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Compute
At this point, I was able to move the VM and all of its resources to the correct resource group using either PowerShell or the Azure Portal (I assume the CLI worked as well, but I didn’t need to do more testing).
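In PowerShell, the wait-then-move sequence can be sketched roughly like this; the resource group and VM names are placeholders, and you may want to filter the resource list more carefully than this simple name match.

```powershell
# Poll until the feature registration completes (it can take several minutes)
while ((Get-AzureRmProviderFeature -FeatureName ManagedResourcesMove `
        -ProviderNamespace Microsoft.Compute).RegistrationState -ne 'Registered') {
    Start-Sleep -Seconds 30
}

# Gather the VM's resources (VM, NIC, disks, etc.) and move them together
$resources = Get-AzureRmResource -ResourceGroupName 'OldRG' | Where-Object { $_.Name -like 'MyVM*' }
Move-AzureRmResource -DestinationResourceGroupName 'NewRG' -ResourceId $resources.ResourceId
```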
The post Moving VMs in Azure with Managed Disks Between Resource Groups appeared first on SQL Server with Mr. Denny.
Data exploration and documentation can be cumbersome, and classifying data can be one of those tedious but necessary things. With the introduction of regulations such as GDPR, there is an ever greater need to classify and label your sensitive data. Microsoft attempts to help us out with this in the new SQL Data Discovery and Classification feature in SQL Server Management Studio (SSMS). This new feature is available in Azure SQL Database and SSMS 18.0 and higher. It can be used against databases on SQL Server 2008 (compatibility mode 100) and greater.
This feature scans through your database tables and identifies columns that may contain sensitive data. It then allows you to categorize that data, and it provides a detailed report for auditing and compliance purposes.
Let’s see how it is done.
In SSMS, right-click on the database you want to use, choose Tasks, then Data Discovery and Classification, and then choose Classify Data. In this example, I am using the AdventureworksDW2016CTP3 database.
A results window will pop up showing how many field recommendations it has found. Click on it to view them.
When you view the data, you will see it broken down by Schema, Table, and Column. SSMS then attempts to categorize the information (information types) and estimate a sensitivity level (sensitivity label). It allows you to accept a recommendation by checking the box on the left-hand side. If it’s not quite what you wanted, you can adjust the Information Types and Sensitivity. Once you are satisfied with the category assignments, click on the blue “Accept selected recommendations” button located at the top and choose Save.
As shown below, I have classified 10 columns and have 64 left unclassified.
It also gives me the ability to manually add a field and classification by choosing the Add Classification button at the top.
As I mentioned previously, this feature provides a way to report on the data classification.
You can retrieve the report by clicking View Report, located adjacent to the Add Classification button at the top.
*Note: if your report is blank, you forgot to hit Save while classifying.
The report breaks all your data down nicely by schema, information type, and sensitivity. You can see how this can be very useful and insightful when reviewing data for compliance. It does take a little time to go through and validate the results of the classification. While this process might be lengthy to accomplish, in the long run it is well worth the time.
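You can also query the classifications back out with T-SQL. This is a hedged sketch based on the assumption that SSMS stores the classifications as column-level extended properties with names beginning with sys_; check your own database to confirm the property names it uses.

```sql
-- List the information type and sensitivity label assigned to each column
SELECT SCHEMA_NAME(o.schema_id) AS [Schema],
       o.name   AS [Table],
       c.name   AS [Column],
       ep.name  AS [Property],
       ep.value AS [Classification]
FROM sys.extended_properties ep
JOIN sys.objects o ON ep.major_id = o.object_id
JOIN sys.columns c ON ep.major_id = c.object_id AND ep.minor_id = c.column_id
WHERE ep.name IN ('sys_information_type_name', 'sys_sensitivity_label_name')
ORDER BY [Schema], [Table], [Column];
```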
There’s been a lot of discussions recently about SQL Agent jobs, proxy accounts, and job ownership. I wanted to try and clarify some of the myths out there, including job ownership and permissions.
The owner of a job determines the security context that the SQL Agent job runs as. By default, the owner is the user that created the job. For normal application-level jobs, the owner can be changed to an application-level account, or another non-privileged login which has the permissions inside SQL Server to do the work that the job needs to do.
The account that the SQL Server Agent service runs as must have sysadmin rights within the SQL Server instance; otherwise the SQL Server Agent will not be able to run. Also, maintenance jobs that rebuild indexes or update statistics will need at least database owner rights within the databases they touch, if not sysadmin rights within the instance.
Jobs for things like Replication and CDC should generally be left alone with sysadmin rights as that’s what they are going to need to function.
Jobs that are created by your application should run with whatever permissions that they need to function.
If you want (or need) to run job steps where each job step executes as its own account, then proxy accounts are going to be the way to go. You configure proxy accounts per job step in SQL Agent, so you can create different proxies for each job step if you want or have to. These proxy accounts can have whatever permissions the job needs to complete the task at hand. If a job step needs sysadmin rights to complete, then you can create a SQL Server proxy for that step; if a job step needs minimal rights to run, then you can safely grant the proxy just the permissions that it needs.
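Setting up a proxy is done in msdb with a few system stored procedures. This is a minimal sketch; the credential, proxy, and login names are placeholders, and it assumes a credential named ETLCredential has already been created with CREATE CREDENTIAL, mapped to a Windows account that has just the permissions the job step needs.

```sql
USE msdb;

-- Create the proxy on top of the existing credential
EXEC dbo.sp_add_proxy
    @proxy_name      = N'ETLProxy',
    @credential_name = N'ETLCredential',
    @enabled         = 1;

-- Allow the proxy to run CmdExec job steps (subsystem_id 3)
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'ETLProxy',
    @subsystem_id = 3;

-- Let a non-sysadmin login reference the proxy in its job steps
EXEC dbo.sp_grant_login_to_proxy
    @login_name = N'AppLogin',
    @proxy_name = N'ETLProxy';
```

Once the proxy exists, a job step can reference it via the Run as drop-down in SSMS or the @proxy_name parameter of sp_add_jobstep.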
If you want to dig more into the security of SQL Server, then I’d highly recommend that you look over a copy of my book, Securing SQL Server, and check out my precon at the PASS Summit in 2018.
The post Who should own your SQL Agent jobs? appeared first on SQL Server with Mr. Denny.