Office 365 Lync displays wrong name

Recently we came across an interesting bug in Office 365. If you use ADFS to authenticate your Office 365 users, and some of those users have multiple email address aliases assigned using adsiedit.msc, Lync might display the wrong name.

For example, a user's name is Walter, and his primary email address is … (not a real email address). Imagine that Walter's colleague Jesse is leaving the company, and Walter needs to take over Jesse's clients, so all emails addressed to Jesse should now be sent to Walter. At the same time, you don't want to keep Jesse's mailbox active, because Office 365 charges you per mailbox and that would be a waste of money. So you archive Jesse's existing mailbox and add an alias to Walter's mailbox. And, because you use ADFS, you have to add aliases using adsiedit.msc instead of going through the Office 365 management portal. Makes sense, right? Well, this is where it gets interesting and very, very confusing. Now, when Walter logs into Lync, some users will see Jesse's name show up in their Lync client instead of Walter's. Weird, isn't it?

What appears to be happening is that the Lync Address Book Service (ABSConfig) queries the proxyAddresses attribute in the user's properties and uses whichever entry the query returns first. Because the proxyAddresses field stores entries in alphabetical order, the "Jesse" entry comes before "Walter" in Walter's user attributes. That's why we see the wrong name displayed. It's that simple.
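To make the ordering concrete, here is a toy sketch in JavaScript (the addresses are made up; the real lookup happens inside the Lync Address Book Service):

```javascript
// Hypothetical proxyAddresses entries on Walter's account after the alias
// from Jesse's retired mailbox was added:
var proxyAddresses = [
  "smtp:walter@contoso.com", // Walter's own address
  "smtp:jesse@contoso.com"   // the alias inherited from Jesse's old mailbox
];

// Sort alphabetically and take the first entry, as the address book does:
var firstEntry = proxyAddresses.slice().sort()[0];
// "smtp:jesse@contoso.com" sorts before "smtp:walter@contoso.com",
// so Jesse's entry wins and his name is the one that gets displayed.
```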

Anyway, if this were an on-premises Lync server, there would be at least a couple of fixes for this problem. Both involve making changes on the server side. But this is Office 365, and we do not have access to the server side. What are those of us living in the cloud supposed to do?! As far as I know, there is no fix, but there is a workaround. Instead of creating email address aliases using adsiedit.msc, you can:

  1. Create a distribution list in the Office 365 management portal. Make sure to allow external senders to send emails to this distribution list, so that emails don't bounce back.
  2. Assign any email address aliases to that distribution list right from the Office 365 management portal. For example, … or …
  3. Add the intended recipient(s) to the distribution list, for example, … Now, when people send email to Jesse, every message will be delivered to Walter's mailbox, and everyone will see Walter as Walter when he signs into Lync. It's a win-win.
  4. (Optional) Hide the distribution list from the Address Book, so your people don't get confused when they search the internal Global Address Book.

Well, it's not exactly a fix, it's a workaround, and it will do for now. I do hope, though, that Microsoft will fix this bug in Office 365. Sometime in the next 20 minutes would be great. ;)

Removing Another User's Lock in TFS

One of the joys of distributed development teams is unexpected locks. In this particular case, the file was locked by a very distributed developer, and I needed to get it unlocked, as the lock was preventing a build from running. Oh, and I was using … as the source control repository.

Step 1 – Determine the workspace

In order to perform an unlock/undo, you need to know the workspace and user involved. To find out the workspace for a user, there is a workspaces option on the tf command-line tool. So open up the Visual Studio Command Prompt and navigate to your locally mapped directory for the project. This navigation is important, as it lets you minimize some of the command-line options that we will be using.

Once you’re in the directory, execute the following command:

tf workspaces /owner:domain\userid

In this case (since we're using …), the domain\userid is actually the Live ID of the user who currently holds the lock. The output from this command includes the name of the workspace in question.

Step 2 – Undo pending changes (thus releasing the lock)

Another tf command is required for this step. In the same command line window, execute the following command:

tf undo itemspec /workspace:workspace;domain\userid

In this case, the itemspec is the path to the locked item (for example, $/MyProject/Directory/fileName.txt), the workspace is the name of the workspace identified in step 1, and domain\userid is the login ID (or Live ID, in our case) of the person who owns the workspace (and who has the item checked out).

And voila. The lock is undone and I'm now free to wreak havoc…er…check in my code.

How to back up Azure databases

As we started using SQL Azure more and more for storing data, we had to come up with an easy and inexpensive way to back up Azure databases. There are a number of tools available for backing up Azure databases, but they usually require a separate install, and they are never free. Sometimes they are fairly inexpensive, but I like the free ones better.

So, after a bit of research, I discovered an easy way to back up SQL Azure databases to my on-premises (offsite) SQL Server: SQL Data Sync. D'oh! This is existing functionality in Azure, and it can be accessed through the "old" Windows Azure portal interface (…). I am not going to write step-by-step instructions because, in this case, the user interface is actually very intuitive, and once you get to the Data Sync part of the Azure portal, you will know what to do. Good luck!

{ Ping me if you need any help or have any questions about this article }

A Rose by Any Other Name

I'm just putting the finishing touches on a book (Professional Visual Studio 2012 from Wiley), and as a result, I've had to learn precisely what the names of the UI formerly known as Metro would be. You can find the 'answer' in the table below, just for future reference.

Old Term                 New Term
Metro apps               Windows Store apps
Metro design language    Microsoft design style language
Metro style principles   Microsoft design style principles

If Modern UI/design was your thing, any phrase involving “Modern” was also converted to “Microsoft design/style…”

None of these terms roll off the tongue at all (and I’ll still get caught calling it Metro when I’m speaking). But this is the word…at least for now.

How SharePoint farm locates its configuration database

Did you ever wonder how a SharePoint farm locates its configuration database? It's not in web.config or any other configuration file. It's actually a lot simpler than it probably should be: SharePoint stores the connection to the configuration database in the registry (see the dsn key).

SharePoint 2007: HKLM\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\12.0\Secure\ConfigDB

SharePoint 2010: HKLM\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\Secure\ConfigDB

SharePoint 2013: HKLM\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure\ConfigDB (this one I have not actually checked, but I am betting it’s still there)

Scary, isn’t it?

Duplicate entries in your Office 365 address book

If you're seeing duplicate entries in your (Office 365) Outlook address book, and you do not see any of those duplicates in the Office 365 Portal, then try the following:

  1. Download and install the latest copy of the Microsoft Online Services Module for PowerShell from …
  2. Start a PowerShell session with Office 365 by running the following 5 commands (you will be prompted for your Office 365 credentials as these commands run):
    Import-Module MSOnline
    $O365Cred = Get-Credential
    $O365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri … -Credential $O365Cred -Authentication Basic -AllowRedirection
    Import-PSSession $O365Session
    Connect-MsolService -Credential $O365Cred

  3. Run the following command to check if you have anything in your Office 365 Recycle Bin:
    Get-MsolUser -ReturnDeletedUsers
  4. If the command above returns any Recycle Bin items, then run the following command to remove all items from the Recycle Bin:
    Get-MsolUser -ReturnDeletedUsers | foreach { Remove-MsolUser -ObjectId $_.ObjectId -RemoveFromRecycleBin -Force }
  5. Run the command from step 3 again to make sure that your Office 365 Recycle Bin is empty.

Getting Stuff Done

For whatever reason, the beginning of September has more of a “new year” feel to me than the beginning of January. Actually, I know the reason and it begins with four school age kids. :)

But regardless, I've been trying to focus for the last couple of weeks on execution and follow-through: trying to change, just a little, how well I complete tasks, set priorities, and just get stuff done. While I'm not going to suggest that any of what you'll find below is original or mind-blowing, I believe that when it comes to self-improvement, little steps are the way to go. So here is what I've been trying to improve, bit by bit.

Schedule it or forget it - We all live and die by our calendars. One of the joys of a smartphone is the ability to stay close to your schedule, and I get a warm fuzzy knowing that I can quickly find out where I need to be next. Most of my unproductive moments happen when things fall outside of this process. If you're working on a team, make sure that you're using a shared team calendar, or at least share your Outlook calendar so that others can see what you're up to. Include team deadlines in this calendar. Put project update meetings (preferably recurring) into the calendar, and make them weekly to drive the pace of progress on the project.

Prioritize to the nth degree - Do you have all the time you want to do everything that you want? No? Not surprisingly, you're in the same boat as everyone else. So pick the stuff that matters and focus on that. If you choose to do only important tasks, you'll find a couple of weird things happening. First, many of the non-important tasks will just take care of themselves. Second, you'll feel like you've accomplished stuff during the day. And that's a feeling worth cultivating.

Any decision is better than no decision – There is rarely an obvious best path to take. And even more rarely do we have all of the information that we need to choose that path. Reality is fuzzy, and there are times when testing and taking tentative steps is not possible. So just make the decision. A full-scale, all-in decision. Decisiveness is better than procrastination, because action leads to learning, which in turn creates better decisions. And if you're on a team, it's even more important. People are waiting to know where to go and what to do. The decision energizes the team and gets them moving towards their goal.

Create new habits – While I was on vacation in August, I read a book on habits: habits of all kinds, formed by all sorts of people and organizations. It turns out that much of what we do over the course of the day is done by habit, and just realizing how habits are formed and reinforced can help you chart a new course of action. By the way, the book is "The Power of Habit: Why We Do What We Do in Life and Business" and it's well worth the read.

These are just a start. But I made a decision, focused on small, but important steps and did what I thought was the most likely to see success. So far, so good.

Concatenating strings in a batch file

Batch file? Yes, you heard me: a batch file, you know, .cmd.

This is an odd post for me; however, I needed to do this recently and I don't want to forget how I did it. :)

When automating builds, I often take advantage of the InvokeProcess activity. It's a nice way to keep my build process generic and call out to a process that may change over time or is specific to the branch I am building. It allows me to keep some of the process specific to the source code, by storing the batch file with the source code. This also lets me version the batch file if it changes over time, and ensures it's available to the build server without installing anything on it. None of this has anything to do with my blog post, though. Whatever the reason you may be calling out to a batch file using InvokeProcess, read on.

Have you ever had trouble passing variables like DropLocation or SourcesDirectory? If they have spaces in them, you will need to put them in quotes. Let's say I want to pass the DropLocation to a batch file.

The Arguments property of my InvokeProcess activity would look like this: """" + DropLocation + """"

Notice the double quotes around the variable; they ensure it is passed to the batch file as one argument:

“\\My File Server\Drops\BuildDefinition_1.3.4636.1”

Without the quotes, in this example my batch file would have received 3 arguments:

  1. \\My
  2. File
  3. Server\Drops\BuildDefinition_1.3.4636.1

Therefore, I need the quotes.
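The space-splitting above is easy to sketch in JavaScript (purely an illustration; the build process, not the batch file, does this splitting):

```javascript
// The unquoted UNC path from the example above, split on spaces the way
// an unquoted command-line argument would be:
var unquoted = "\\\\My File Server\\Drops\\BuildDefinition_1.3.4636.1";
var args = unquoted.split(" ");
// args now has 3 pieces: \\My, then File, then
// Server\Drops\BuildDefinition_1.3.4636.1
```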

In my batch file, I want to access files in a subfolder of my DropLocation, so I need to concatenate the path passed in with another folder. Ultimately, I want this location:

\\My File Server\Drops\BuildDefinition_1.3.4636.1\MyFolder

The problem is the double quote at the end of the argument: %1\MyFolder would end up being "\\My File Server\Drops\BuildDefinition_1.3.4636.1"\MyFolder

Here is how to fix that problem.

First, create a variable to hold your argument:

SET DROPLOCATION=%1

Then, when you go to use the variable, apply a little string manipulation. The syntax is %VAR:~f,e%, where f is the offset from the start of the value and e is the length to keep (a negative e counts back from the end instead).

Therefore, to get rid of the double quote at the end of our argument, just remove the last character.

Like this: %DROPLOCATION:~0,-1%\MyFolder"

The above syntax will result in "\\My File Server\Drops\BuildDefinition_1.3.4636.1\MyFolder", making it possible to add my folder in the batch file instead of in the build process.
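For comparison, here is the same trim-and-append sketched in JavaScript, using the hypothetical path from the example above:

```javascript
// The argument as the batch file receives it: a quoted UNC path.
var dropLocation = '"\\\\My File Server\\Drops\\BuildDefinition_1.3.4636.1"';

// slice(0, -1) drops the last character (the trailing quote),
// just like %DROPLOCATION:~0,-1% does in the batch file, and then
// the subfolder and a new closing quote are appended:
var fixedPath = dropLocation.slice(0, -1) + '\\MyFolder"';
// fixedPath is the quoted path ending in ...BuildDefinition_1.3.4636.1\MyFolder
```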

What's New with SQL Server 2012?

Like any newly released version of software, Microsoft SQL Server 2012 brings a host of significant changes to Microsoft's enterprise data platform. In this case, the changes are not only in the features but also in the editions that are being offered. There is a new certification model, and many performance, business intelligence, and development enhancements. So take a look at the list of features and see which ones matter to you.

Data Quality Services – Clean data is happy data. And Data Quality Services (DQS) is a knowledge-based tool that helps ensure your databases contain high-quality, correct data. DQS performs data cleansing, which can modify or remove incorrect data. It also does data matching to identify duplicate data and profiling that intelligently analyzes data from different sources. Keeping your data happy has never been easier.

Changes to T-SQL – There are a number of T-SQL enhancements that have been introduced in SQL Server 2012. My personal favorite is TRY_CONVERT for data conversions. But that’s a personal preference. Others might be partial to OFFSET and FETCH (for data paging), a new FORMAT() function (data formatting made easier), and a new THROW operator (exception handling, .NET style).

SQL Server Data Tools - SQL Server Data Tools uses the Visual Studio 2010 shell and enables model-driven database development. This is on top of more traditional T-SQL and SQLCLR development and debugging. From a connectivity perspective, SQL Server Data Tools can connect to SQL Server 2005 and later, as well as to SQL Azure. That last feature makes it particularly useful, as I have found myself putting more and more databases into the cloud.

Windows Server Core Support - Windows Server Core is designed to support server-based, infrastructure applications. We’re talking about applications such as SQL Server that provide back-end services but don’t need a GUI on the same server. Previously, SQL Server could not be run on Windows Server Core, which seemed ironic (and not in the Alanis Morissette way). SQL Server 2012's support for Windows Server Core allows leaner and more efficient SQL Server installations, reduces the potential attack surface, and minimizes the need to apply non-relevant updates and patches.

These are not all of the features, but they are the ones that spring to mind for me. Also included in SQL Server 2012 are new SKUs (Enterprise, Business Intelligence, Standard, and Express), contained databases, a columnstore index used to support high-performance data warehousing scenarios, Power View, and AlwaysOn Availability Groups.

Want to learn more? View the new SQL Server 2012 courses (and SQL Server 2008 courses) that are being offered at ObjectSharp right now!


Revealing Module Pattern in CoffeeScript

My friend Colin introduced me to CoffeeScript last week through Mindscape's Web Workbench. As I started to learn CoffeeScript, the first thing I wanted to do was hand-translate some of my JavaScript files to see how the CoffeeScript compiler would convert them back to JavaScript, so I would end up with functionally the same code in the end.

I use the Revealing Module Pattern quite a bit. I found an implementation written by Burk Holland, and in the comments of Burk's post, Steve Howell and Elijah Manor provided additional implementations. Although Burk's example came close to what I was looking for in simplicity and cleanliness, I felt that there could be some improvements:

  • It defined the module in the global namespace. I always create my modules in a namespace I define for my project to further protect against naming collisions.
  • I didn't like how the module ended with the definition of a variable to contain the public functions.

I put forward the following approach:

I use the existential operator to reuse my project's namespace if it is already defined, or to create a new one. Although the || operator does produce nicer JavaScript output, I'm sitting on the fence as to which way to go.

I create an object as the last expression in the function, and that object becomes the return value. This skips the creation of a temporary variable that is never referenced anywhere else.
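Putting those two points together, the produced JavaScript looks roughly like this sketch (MyCompany and calculator are hypothetical names):

```javascript
// Reuse the project namespace if it already exists, otherwise create it:
var MyCompany = MyCompany || {};

MyCompany.calculator = (function () {
  var total = 0; // private state, invisible outside the module

  function add(n) {
    total += n;
    return total;
  }

  function reset() {
    total = 0;
  }

  // The object literal is the function's last expression, so no
  // throwaway variable is needed to hold the public members:
  return {
    add: add,
    reset: reset
  };
})();
```

Calling MyCompany.calculator.add(2) returns 2, while the total variable itself stays private.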

I'm pretty happy with the JavaScript that is produced. It very closely matches how I implement the Revealing Module Pattern in JavaScript, and it upholds the simplicity and cleanliness of CoffeeScript.