Exchange Public Folder migration code

Published On: 2019-01-21

This last week John Morehouse and I did a significant office migration with one of our clients. As part of the migration, we decided to move their public folders from Exchange 2016 to Office 365 so that their public folders would be hosted in the same place as their mailboxes, allowing us to decommission Exchange, as the public folders were the last thing left on the servers.

Microsoft has some documentation on the migration, but it is rather lengthy and takes a long time to go through. We only had about 400 MB of data to move, most of which was contacts and calendars. Rather than go through that lengthy process for such a small amount of data, we opted to export the data through Outlook and reimport it into Office 365. This approach required that we reset the permissions on the data after the migration.

The first thing we needed to do was export the current permissions from the on-prem Exchange 2016 server. The export was done with the Get-PublicFolderClientPermission PowerShell CmdLet, writing the output to a text file like so.

Get-PublicFolderClientPermission "\" | Select-Object Identity, User, AccessRights | Format-Table -AutoSize | Out-File c:\temp\rights.csv -Width 500

This produced a fixed-width text file. We then opened the file in Notepad++ and did some Find/Replace operations to turn it into something we could work with more easily.

First, we searched for any run of three spaces and replaced it with a {pipe} (we had a couple of values that contained two spaces, so we couldn’t split on two). We then searched for {pipe}{space}{space} and replaced it with {pipe}, then searched for {pipe}{space} and replaced that with {pipe}. Finally, we ran a search and replace for {pipe}{pipe}, replacing it with {pipe}, and repeated that until there were zero results found in the file. This left us with a pipe-delimited file that we could work with.
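If you would rather script that cleanup instead of doing it by hand in Notepad++, a rough PowerShell equivalent of those find/replace passes might look something like this (the file paths are just the hypothetical ones used elsewhere in this post):

# Rough equivalent of the Notepad++ find/replace passes described above
$lines = Get-Content C:\temp\rights.csv
$cleaned = foreach ($line in $lines) {
    $line = $line -replace '   ', '|'    # a run of three spaces becomes a pipe
    $line = $line -replace '\| +', '|'   # drop any spaces left sitting after a pipe
    $line -replace '\|{2,}', '|'         # collapse repeated pipes down to a single pipe
}
$cleaned | Set-Content C:\temp\rights2.csv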

We then told Exchange to use the Office 365 version of the public folders instead of the local version. We did this with the Set-OrganizationConfig CmdLet. Keep in mind that this CmdLet will delete all the data in your Exchange server’s Public Folder database, so make sure that you export your data from the public folders, and that you pull down your permissions, BEFORE you run it.

Set-OrganizationConfig -RemotePublicFolderMailboxes $Null -PublicFoldersEnabled Local

After this was all done, we connected a PowerShell window to Office 365 and processed the text file to put the permissions back. This was done using a short PowerShell script.
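For reference, connecting that PowerShell window to Exchange Online looked roughly like the standard remote session setup Microsoft documented at the time (nothing here is specific to this migration; newer tenants would use the Exchange Online PowerShell module instead):

# Prompt for an Office 365 admin credential and open a remote session to Exchange Online
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
    -Credential $UserCredential -Authentication Basic -AllowRedirection
# Bring the Exchange Online CmdLets (including Add-PublicFolderClientPermission) into the local session
Import-PSSession $Session -DisableNameChecking

With that session in place, the permissions script below did the rest.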

# Read the cleaned-up, pipe-delimited permissions file
$a = Get-Content C:\temp\rights2.csv
$a | ForEach-Object {
    $data = $_

    # Columns are Identity | User | AccessRights
    $arr = $data.Split("|")

    # Strip the braces Exchange wraps around the rights list, e.g. {PublishingEditor}
    $perm = $arr[2].Replace("{", "").Replace("}", "")
    $perma = $perm.Split(",")

    # Convert the plain array to an ArrayList before passing it to -AccessRights
    $permb = [collections.arraylist]$perma

    # Echo the current line so we know what was being processed if an error occurs
    $data

    Add-PublicFolderClientPermission -Identity $arr[0] -User $arr[1] -AccessRights $permb
}

This script told us what it was working on (in case there was an error), then it did the work (using the Add-PublicFolderClientPermission CmdLet). The biggest trick with this script was converting the “normal” array named $perma into the [collections.arraylist] named $permb and then using that.
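For illustration, here is that conversion on its own, using a made-up rights string (the variable names simply mirror the script above):

# A plain string array from splitting the rights column
$perma = "PublishingEditor,FolderVisible".Split(",")
# Cast it to an ArrayList, which is what we ended up handing to -AccessRights
$permb = [System.Collections.ArrayList]$perma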

This script ran for a few hours, as Add-PublicFolderClientPermission isn’t very fast, but the result was that the permissions were back where they should have been, the Public Folder was hosted in Office 365 instead of Exchange, and we were one step closer to being able to get rid of the Exchange server. And most importantly, the users didn’t report a single problem after the migration was done.

Denny


When is the Right Time to Look at New Services?

Published On: 2019-01-14

Microsoft Azure is rolling out new features at a fantastic speed. But when is the right time to evaluate those new features? It might be right as the feature is released, or it might be later in the lifecycle. The basic answer to when to look at using new services is: it depends.

If the company has some problem that needs to be solved that we can’t address with the technology available today, or a solution we have already built could be made better, then a new feature or new service might be worth looking into.

If there’s nothing that needs to be solved, then the new feature isn’t helping you do anything. The feature is just shiny and new.

What it comes down to is this: can the new feature potentially solve a problem that you or the company is having? If it can’t solve the problem, then the new feature isn’t going to help you. Now, this is going to mean some research into the new feature to see whether it’s the right solution or not. But if the feature being researched turns out not to be a solution to the problem (or a better solution than what you have today), then it’s time to move on to another option.

All too often, companies decide that they are going to use a particular solution to a problem whether or not it’s the right one, even if the solution they have settled on costs thousands or even millions of dollars a year.

Selecting a solution to a problem gets more complicated when office politics are involved. All too often, someone from upper management will decide that some product needs to be included in the solution. This isn’t a new problem, either.

Back in the early 2000s, I was tasked with building a new knowledge base for a department at the company I worked at. XML was getting popular and was being mentioned in all the magazines. The knowledge base was supposed to be a normal relational database that people could look things up in. A senior manager wanted to use XML instead of a relational database. I refused, because XML wouldn’t perform well, it wouldn’t scale well, and it made no sense to build the database as a single XML file (which is what he wanted). He insisted we use XML, so I asked him whether he wanted to use XML or whether he wanted the application to scale and perform well. It took a while, but eventually he saw reason and we used SQL Server for the relational data of the application. And, shockingly, the application was used successfully by thousands of employees on a daily basis.

What this goes to show is that applications should be built to be successful, not to use some shiny bit of new technology.

Denny


My 2018 Blogging By The Numbers

Published On: 2019-01-06

2018 was a great year of blogging for me. A decent number of people read the articles I posted this year. My numbers are a bit off, as there were some issues I didn’t notice when I upgraded the WordPress plugin that counts all the views. The plugin recorded about 100k views of my posts; based on the stats that were recorded, my estimated page views are about 150k for the year. Given that recovery this year, the lower number of recorded page views makes sense.

The most popular post was Difference between an Index and a Primary Key, with the next most popular being my post on what MSDTC is, titled What exactly is MSDTC, and when do I need it?

Other popular posts were about SQL Server Replication, Microsoft Ignite announcements, SQL Server NUMA nodes, SQL Server NOLOCK, and how important Disaster Recovery is.

That’s a pretty wide spread of topics, but it gives me some idea of what people are interested in reading about.

Here’s to a great 2019.

Denny


DCAC Welcomes Our Newest Team Member

Published On: 2019-01-02

Today we start the new year (yes I know, yesterday was Jan 1, but it was a holiday) with a new team member at Denny Cherry & Associates Consulting. Peter Shire (T) is joining our team as our new Director of Sales.

Peter and Bill

 

Peter is a successful sales and marketing professional with vast experience and connections throughout the SQL Server community. He began his career with an eight-year stint at Microsoft. Peter enjoys sharing that he “knew Bill when he wasn’t even a Billionaire. Here we are when he was a meager 470-Millionaire!”

Of course, many of you probably recognize Peter from his eleven years at SentryOne and/or his long tenure as the President of the Charlotte SQL Server Users Group (CSSUG). While at SentryOne, serving as the Director of Partner Relationships, he designed, built, and managed their successful Partner program, as well as their relationships with Microsoft and with SQL Server MVPs.

He helped resurrect CSSUG and grow it into one of the nation’s leading user groups. The success of Charlotte’s SQLSaturday #33, the first of the large multi-tiered events and one featuring 23 SQL Server MVPs, was pivotal to bringing the 2013 Summit to Charlotte. Peter’s influence on the community extends beyond Charlotte, as he created unique events that brought renowned SQL Server experts Grant Fritchey and Kevin Kline to multiple PASS chapters in the South. Kevin Kline interviewed Peter in July 2017.

Peter is a husband, father to three daughters and wrangler of two golden retrievers. He loves college basketball but his favorite player of all time is still in high school. You can follow Peter via Twitter and Instagram.

We are extremely happy to be able to welcome Peter to our team.

Denny

