Conferences, no matter the size, all have one thing in common: they need cash to run, from breakfast until dinner, for the day (or multiple days) that the event runs. The less expensive the event is for the attendee, the more the team running the event needs to make up that cash from somewhere else, usually from sponsors.
Can an event run without sponsors? Sure. But say goodbye to coffee, snacks, sodas in the afternoon, and possibly lunch. And in most places, say goodbye to the venue. These are all things that sponsors pay for by showing up and being there, among potentially others.
What it boils down to is that at events, especially smaller events, you should thank the sponsors. They gave up their time and their company's cash to talk to you.
The post Be sure to thank the sponsors appeared first on SQL Server with Mr. Denny.
Back when Azure and Azure Active Directory first got Windows Intune, pushing down settings, and specifically oddball settings changes, was complex. In the newest release of Intune, accessible via Azure and Office 365, things have gotten much easier. There used to be a major gap, in that you couldn't run PowerShell directly. You had to convert the script into an EXE, then package it as an MSI and upload the MSI to Azure. Short story, it wasn't easy.
Now, however, you just need to sign your PowerShell script (which was much easier than I was expecting) and upload it to the Azure portal. Then tell Azure which users are assigned to run the script. After that, give the system some time to push to your users, and the script will be run against the users' machines as needed.
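As a minimal sketch of the signing step, assuming a code-signing certificate is already installed in the current user's certificate store (the script filename below is hypothetical):

```powershell
# Grab the first available code-signing certificate from the
# current user's personal store.
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert |
    Select-Object -First 1

# Sign the script that will be uploaded to Intune.
Set-AuthenticodeSignature -FilePath .\New-CorpVpn.ps1 -Certificate $cert

# Confirm the signature is valid before uploading.
Get-AuthenticodeSignature -FilePath .\New-CorpVpn.ps1
```

If `Get-AuthenticodeSignature` reports a status of `Valid`, the script is ready to upload to the portal.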
In our case, we’ve got a non-standard VPN configuration, but using PowerShell, I was able to create the VPN connection on users’ computers easily enough. Let’s look at how it was done. The first step in Azure is to bring up “Intune” from the service list.
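The VPN-creation script itself isn't shown in the post, but a sketch of that kind of user-level script, using the built-in `Add-VpnConnection` cmdlet with hypothetical server and connection details, might look like this:

```powershell
# Hypothetical VPN details; adjust to match your environment.
# -AllUserConnection is omitted so the connection is created
# only for the signed-in user (matching a user-level script).
Add-VpnConnection -Name "Corp VPN" `
    -ServerAddress "vpn.example.com" `
    -TunnelType L2tp `
    -AuthenticationMethod MSChapv2 `
    -L2tpPsk "PreSharedKeyGoesHere" `
    -RememberCredential `
    -Force
```

`-Force` suppresses the confirmation prompt that `-L2tpPsk` would otherwise trigger, which matters since the script runs unattended.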
After opening Intune, select the Device Configuration option from the menu. This gives you access to the place where you’ll upload your PowerShell scripts.
The next step is to set up a Certificate Authority internally. While this isn’t required, it’s recommended so that all users get the CA configuration. From what I’ve been able to tell, with a CA in place (and duly registered and synced with Azure), multiple users can sign code and make it available for download and execution by users. For a more extensive IT shop, this is going to be critical. For smaller shops, it may not be needed, but it will make life easier.
If you opt not to set up a CA within the network and sync it to Azure, then you’ll need to upload the certificate being used to sign code, and you can only upload a single certificate.
Once the CA is set up and Azure AD sees it (via AD Sync, I assume), the menu changes so you can download the sync software. This took about 10 minutes for me when setting it up.
These changes are all done using the “Certification Authority” menu option that you see under “Device Configuration.”
Once the Certificate Authority is set up, you can go into the PowerShell scripts section of the screen. From there, just click the “Add” button to add a PowerShell script to Intune.
Once you’ve added a PowerShell script, you can give it a name and point Azure to the signed PowerShell script so it can be run by users. There’s not much under “Settings” to work with.
The first setting asks whether this is a user-level script or a system-level script. By default, scripts are run by the system account, but there are a lot of cases where you want things to run at the user level instead, so you’ve got both options available. My script was written as a user-level script, so I set this to “Yes.”
The second setting controls whether the system checks that the code is signed by a known code publisher, or whether Intune skips the check. When I was working with this, I left it at “No,” and everything worked exactly according to plan (I also had a CA set up and synced with Azure and Intune).
After creating the script, the Portal should take you to the details of that specific script. The next step would be to change to the “Assignments” page. This is where you configure which domain groups will have access to download and run the script.
When you select “Assignments,” you can select as many groups as needed to assign to this specific script. Groups can be synced from on-premises, can be AAD/O365 only, or can even be dynamic groups, where users are added automatically based on how their attributes are configured.
It may seem like there are a bunch of steps to get this completed, but realistically, once the PowerShell script was written, it took about 5 minutes to set up the script to be pushed out. After that, it was just a matter of waiting for users’ systems to refresh and pick up the change.
The post Managing VMs via Azure Active Directory just got a lot easier appeared first on SQL Server with Mr. Denny.
The short answer is yes, there are ports that you’ll want to block outbound by default. There’s a variety of amplification attacks that your machines could unwittingly take part in. These attacks aren’t against your systems, but you run the risk of your machines being used to amplify attacks against others. These could be DNS-based, NTP-based, or other kinds of amplification attacks.
Occasionally I get notifications from Azure that they see these ports open, recommending that you use Network Security Groups to close the unneeded ports.
Two of the ports that I’ve needed to deal with recently are UDP 123 and 389 (NTP and LDAP, respectively). Blocking these was a minor inconvenience, but it’s best practice.
Blocking these in Azure is super low risk and easy to implement.
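As a sketch of what that looks like with the Az PowerShell module (the resource group, NSG, and rule names below are hypothetical), an outbound deny rule for those two UDP ports can be added to an existing Network Security Group like this:

```powershell
# Fetch the existing NSG (hypothetical names).
$nsg = Get-AzNetworkSecurityGroup -ResourceGroupName "MyRG" -Name "MyNsg"

# Add an outbound deny rule for UDP 123 (NTP) and 389 (LDAP)
# to traffic headed for the Internet.
Add-AzNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg `
    -Name "Deny-Outbound-NTP-LDAP" `
    -Access Deny -Direction Outbound -Priority 200 `
    -Protocol Udp `
    -SourceAddressPrefix "*" -SourcePortRange "*" `
    -DestinationAddressPrefix "Internet" `
    -DestinationPortRange 123,389

# Persist the change back to Azure.
Set-AzNetworkSecurityGroup -NetworkSecurityGroup $nsg
```

Using the `Internet` service tag as the destination keeps the rule from interfering with legitimate NTP or LDAP traffic inside your own virtual network.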
To be clear, there is no inherent risk in being in Azure compared to other platforms. These sorts of amplification issues can come up in any environment. The beautiful thing about Azure is that they monitor these outbound issues and report back to the end user on what blocking needs to be done.
The post Should I be blocking outbound ports in Azure by default? appeared first on SQL Server with Mr. Denny.
Tweets, Facebook posts, and blog posts can be powerful things. They have the ability to sway people’s opinions of others, to drive people to buy software, to sell stock, and to make bad decisions.
Posting cranky posts just to get clicks, views, and retweets does nothing useful; it only shows that all you care about is stirring the pot.
An example of a non-constructive tweet
There are lots of ways of being constructive without fanning the flames. In the above tweet, the author just craps all over someone, I assume the people who made the service pack, with no context or any follow-up at all. I get that it’s only a tweet with 140 characters, but there are ways to give context. In our next example, we see exactly how. We have a thank-you to Microsoft for the lovely lapel pin/magnet, but a warning to people who aren’t used to handling rare earth magnets that they need to be kept away from kids. As it’s a longer post (from Instagram), there’s a link through to the original, where the rest of the post finishes with “These are dangerous.” The warning is still given, but without crapping all over the fact that someone went through the trouble of sending these out to the MVPs.
A constructive post
I think my message here is: think before you post. Think about how it’s going to impact others, not just those you want to read it, but those who did the thing you’re writing about. Maybe rephrase that snarky post and it’ll have more of the desired impact. I can almost guarantee that the first tweet had no useful impact on the SQL Server product team, whereas the second post would have had much more impact on the MVP team when designing the next round of awards.
The post Being critical without being a crank appeared first on SQL Server with Mr. Denny.