In the SQL Server community, we are basically one big #sqlfamily. This is a story of how the family helped one individual, and it illustrates one of the main reasons I am involved in the first place.
As happenstance would have it, both our children attend the same school. He happened to recognize me at a school meeting one evening. Well, he thought I was the DBA with a Bat (my Twitter persona), and it turns out he was right. Of course, I didn't have my bat with me at the time, so there is that.
Of course, after that, we got to chatting as most nerds do. This led to a couple of lunches. Our chats eventually led to Jerad telling me he really wanted to move into IT and away from the business side. It turns out that, at the time, Jerad was a business analyst for a large organization here in Louisville, Kentucky. He had skills in SQL Server, as he had written a number of ETL processes along with reporting and data analysis. He attended user groups when he could.
That’s all it took.
Over the course of the following weeks, I introduced Jerad to Mallory, a local recruiter for TEKSystems. Mallory and her employer are both highly engaged with our local SQL Server presence. They quite often sponsor user group meetings as well as our annual SQL Saturday events. I knew that Mallory could help him get into a role that better suited his long-term goals for his career.
I also trusted Mallory. I knew that if I sent Jerad to her, she would take care of him as best as possible.
She didn’t disappoint. Over the course of a couple of weeks, Jerad accepted a new position with a smaller organization here in town, moving his career down the path that he wanted.
Could Jerad have done this on his own? Absolutely.
Did I get the job for Jerad? Not even close. He is the one who did the work, through the interview process and eventually getting hired. I just opened the right doors for him. He deserves all of the credit here.
The point of this? Get involved in your local community. Regardless of your discipline of choice, get involved. There isn't a local presence? Start one. Need help with that? Ask me. I'll help if I can. Even if it isn't SQL Server related.
Here’s a question to ask yourself: What opportunities are you missing by not being involved?
Get out there. Go to your local user group. Attend meetings. Talk to people.
If nothing else, meet someone and say “Hi, I’m [insert_name_here]”.
After all, that's what Jerad did. Look at where he is now.
My final question.
What are you waiting for?
PS – If you want to talk to Jerad about his experience, follow him on Twitter. He’s good people. Mallory too.
VMware recently announced their vExpert list for 2018, and I'm proud to say that both Joey and I made the list. The VMware vExpert program is designed to recognize those who help out the VMware community at large. While the vExpert list looks rather large, it's a big world, and there are a lot of people around the globe helping VMware users set up various VMware products. I know that both Joey and I are thrilled to have received this award for 2018.
In my experience, there have been occasions where SQL Server Management Studio (SSMS) becomes unresponsive for a length of time. In one particular case, this occurred while I was expanding the list of databases. SSMS just sat there, waiting for some unknown reason, which was frustrating. The client eventually did respond and life was good again.
The frustrating part is not knowing what is going on behind the scenes. However, I've recently learned that you can see what the client is doing when certain actions are performed, such as expanding the list of databases from within Object Explorer. In fact, there is a query being executed. But how do we see that query?
The answer lies in the Output window. As the SSMS documentation describes it: "This window displays status messages for various features in SQL Server Management Studio. Output is delivered to special panes in the Output window from the Transact-SQL debugger, external tools features, or commands that are run in the debugger Command Window."
You can enable the Output window by selecting it from the View menu.
When enabled, you can now see the query that is performed when doing certain tasks within Management Studio. For example, when expanding the Databases branch, I can see the resulting query:
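The exact text varies by SSMS version, but the query that shows up in the Output window looks something like the following sketch against `sys.databases`. The column list here is illustrative, not the verbatim query SSMS issues:

```sql
-- Illustrative sketch of the kind of query SSMS runs when the
-- Databases node is expanded; the real query (visible in the
-- Output window) returns many more columns and properties.
SELECT
    dtb.name                   AS [Name],
    dtb.database_id            AS [ID],
    dtb.state_desc             AS [State],
    dtb.is_read_only           AS [ReadOnly],
    dtb.recovery_model_desc    AS [RecoveryModel],
    SUSER_SNAME(dtb.owner_sid) AS [Owner]
FROM sys.databases AS dtb
ORDER BY [Name] ASC;
```

Pasting the query captured from your own Output window into a query window is the best way to see exactly what your version of SSMS asks for.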
Note that you also get some statistical information about the query, such as the elapsed time. You can manually execute this query and see the results. I find it interesting that while the UI only displays the database name and some metadata about the database, such as whether it's in recovery or read-only, the query actually returns a great deal more information.
Another benefit of the Output window is telemetry. The fact is that Microsoft needs to get feedback in order to improve their products. The only way they can get this feedback is by receiving some type of telemetry on how things are working. This can be done either anonymously or not. After spending this week in Redmond, Washington at the 2018 Microsoft MVP Summit, I can assure you that they really want feedback.
You can see this telemetry by selecting it from the drop-down menu.
Once telemetry has been selected, you will then see the data that is gathered.
You can configure SSMS to either automatically send this data back to Microsoft or not; the choice is entirely up to you. Keep in mind that Microsoft does not collect any sensitive data.
Specifically, Microsoft does not send any of the following types of information through this mechanism:
Any values from inside user tables
Any logon credentials or other authentication information
Personally Identifiable Information (PII)
Also, since we are on the topic of telemetry, my colleague Joey D'Antoni interviewed Conor Cunningham, partner architect for SQL Server/Azure SQL at Microsoft, where they discussed telemetry and how it helps various SQL Server products (both on-premises and in the cloud) get better. The two-part series is a good read, and I'd recommend taking 10 minutes to do so. You can find the first post here.
The Output window in SSMS offers deeper insight into what the client is really doing behind the scenes. This is useful when SSMS isn't behaving as expected, as you can use this tool to help identify potential issues by seeing the queries being executed. It is also handy to be able to see some of the telemetry that might be sent back to Microsoft. While some will think this is Big Brother watching over us in a creepy way, in reality Microsoft uses this information to make the products better.
The grey hairs I see every now and again remind me of how long I've been working in IT. I'm now in my 22nd year of having a job (2 years as an intern, but I did have the 'sys' password), and I've seen a lot of things come and go. When I started working, servers cost at least tens of thousands of dollars, were the size of refrigerators, and had less processing power than the MacBook Pro I'm typing this post on. More importantly, they had service contracts that cost a rather large amount of cash per year. This bought you, well, I'm not sure, but your hardware reps surely took you out for at least a steak dinner or two. Eventually, we moved to commodity hardware (those boxes from HP and Dell that cost a tenth of what you paid 20 years ago) and service contracts dropped to a few hundred dollars per server per year. (And then those sales reps started selling expensive storage.)
Most of my career has been spent working in large Fortune 500 enterprises—I think about things that at the time were only available to organizations of that size and budget, and are now available for a few clicks and a few dollars per hour. I’m going to focus on three specific technologies that I think are cool, and interesting, but as I’m writing this, I’ve already thought of three or four more.
Massively Parallel Processing
MPP is a data warehouse design pattern that allows for massive scale-out solutions to quickly process very large amounts of data. In the past, it required buying an expensive appliance from Teradata, Oracle, Netezza, or Microsoft. I'm going to focus on Microsoft here, but there are several other cloud options for this model, like Amazon Redshift or Snowflake, amongst others. In terms of the on-premises investment, to get your foot in the door with one of these solutions you were looking at at least $250K USD, and that is a fairly conservative estimate. In a cloud world? SQL Data Warehouse can cost as little as $6/hour, and while that can add up to a tidy sum over the course of a month, you can pause the service when you aren't using it and only pay for the storage. This allows you to do quick proof-of-concept work and, more importantly, compare solutions to see which one best meets your needs. It also allows a smaller organization to get into the MPP game without a major investment.
Secondary and Tertiary Data Centers
Many organizations have two data centers. I've only worked for one that had a third data center. You may ask why this is important. A common question I get when teaching Always On Availability Groups is: if we are split across multiple sites, where do we put the quorum file share? The correct answer is that it should be in a third, independent data center (which virtually no organizations have). However, Windows Server 2016 offers a great solution for mere pennies a month: a cloud witness, a "disk" stored in Azure Blob Storage. If you aren't on Windows Server 2016, it may be possible to implement a similar design using Azure File Storage, but it is not natively supported. Additionally, cloud computing greatly simplifies the process of having multiple data centers. There are no worries about having staff in two locales, or getting network connectivity between the two sites; that's all done by your cloud vendor. Just build stuff, and make it reliable. And freaking test your DR (that doesn't change in the cloud).
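On Windows Server 2016 and later, pointing the cluster quorum at a cloud witness is essentially a one-liner in PowerShell. The storage account name and access key below are placeholders, not real values:

```powershell
# Configure the cluster quorum to use an Azure Blob Storage cloud witness.
# <StorageAccountName> and <AccessKey> are placeholders for your Azure
# storage account's name and one of its access keys.
Set-ClusterQuorum -CloudWitness `
    -AccountName "<StorageAccountName>" `
    -AccessKey "<AccessKey>"
```

The cluster only writes a tiny blob for arbitration, which is why the monthly cost is measured in pennies.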
Multi-Factor Authentication
If you aren't using multi-factor authentication for just about everything, you are doing it wrong. (This was a tell that Joey wrote this post and not the Russian mafia.) Anyway, security is more important than ever (hackers use the cloud too), and multi-factor authentication can offer additional levels of security that go far beyond passwords. MFA is not a new thing; I have a collection of dead SecurID tokens dating back to the '90s that tell me that. However, implementing MFA used to require you to buy an appliance (later it was some software), possibly have ADFS, and do a lot of complicated configuration work. Now? It's simply a setting in the Office 365 portal (if you are using AAD authentication; if you are an MS shop, learn AAD, it's a good thing). While I complain about the portal, and how buried this setting is, it's still far easier and cheaper (a few dollars a month) than buying stuff and configuring ADFS.
These are but a few examples of how cloud computing makes things that used to be available only to the largest enterprises available to organizations of any size. Cloud is cool and makes things better.
As Microsoft MVPs and Partners, as well as VMware experts, we are summoned by companies all over the world to fine-tune and problem-solve the most difficult architecture, infrastructure, and network challenges.
And sometimes we're asked to share what we did, at events like the PASS Summit 2015.