Azure AD sync errors for administrative user accounts

This is a follow-up to the "Insufficient access rights to perform the operation error in Azure AD Connect" blog post I did a little while back. The original post covered most Azure AD sync errors, but we kept seeing sync errors for a handful of specific users. After a bit of digging, we learned that the users that kept getting sync errors all belong to the Active Directory Administrators or Domain Admins groups. That led our troubleshooting journey to Active Directory protected groups.

As it turns out, if a user is a member of the Administrators or Domain Admins groups, Active Directory will periodically overwrite any ACL changes you make on that account with a predefined ACL template (the SDProp process stamps the ACL from the AdminSDHolder object onto protected accounts, by default every 60 minutes). So if we make ACL changes, like granting additional permissions on those accounts or enabling permissions inheritance on them so that the Azure AD Connect account can update the source anchor (ms-DS-ConsistencyGuid) attribute, the change will be overwritten by Active Directory, which brings us back to square one. To get around this, you can do one of the following:

  • Change the ACL template on the AdminSDHolder object to include the changes you need, such as the Replicate Directory Changes and Replicate Directory Changes All permissions for the Azure AD Connect account and write permission on the source anchor (ms-DS-ConsistencyGuid) attribute. This would work, but in my opinion it's a bit too drastic.
  • Or, remove the users from the Administrators or Domain Admins groups, if you can.
  • Or, make the permissions changes on those accounts and immediately force an Azure AD Connect sync using the following PowerShell command: Start-ADSyncSyncCycle -PolicyType Initial (see the sketch after this list). Azure AD Connect should have enough time to write the source anchor attribute and complete the sync without errors. After the initial sync is complete, AD can reset the ACL on those accounts all it wants, since we only need to write the source anchor attribute once.
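
For the third option, here is a minimal PowerShell sketch, to be run on the Azure AD Connect server. The account name and the user's distinguished name below are hypothetical placeholders, and it assumes the ADSync module that ships with Azure AD Connect:

$accountName = "DOMAINNAME\USERNAME"   # the Azure AD Connect sync account (placeholder)
$userDN = "CN=Jane Admin,OU=Admins,DC=DOMAINNAME,DC=SOMETHING"   # hypothetical protected account

# Grant the sync account write permission on the source anchor attribute
dsacls $userDN /G "$($accountName):WP;ms-ds-consistencyGuid"

# Immediately force a full sync, before SDProp resets the ACL
Import-Module ADSync
Start-ADSyncSyncCycle -PolicyType Initial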

I prefer the last option, since it's simple and it works. More information about AD protected groups can be found at https://technet.microsoft.com/en-us/library/2009.09.sdadminholder.aspx

Thanks for reading.

Insufficient access rights to perform the operation error in Azure AD Connect

If you're getting the "Insufficient access rights to perform the operation" error in your Azure AD Connect synchronization logs, do the following:

  • Make sure you have the latest version of Azure AD Connect installed: https://www.microsoft.com/en-us/download/details.aspx?id=47594
  • If you're syncing passwords, make sure that your sync service account has the Replicate Directory Changes and Replicate Directory Changes All permissions in your on-premises Active Directory
  • Make sure that your sync service account has write permissions on your sourceAnchor attribute (which is most likely set to ms-ds-consistencyGuid). You can do that either through the user interface or with PowerShell, which is easier:

    $accountName = "DOMAINNAME\USERNAME"   # the account used by Azure AD Connect sync to manage objects in the directory
    $ForestDN = "DC=DOMAINNAME,DC=SOMETHING"

    # Grant write permission (WP) on ms-ds-consistencyGuid for all user objects in the forest
    $cmd = "dsacls '$ForestDN' /I:S /G '`"$accountName`":WP;ms-ds-consistencyGuid;user'"
    Invoke-Expression $cmd

  • Make sure that inheritance is turned on for the AD objects that get errors in the synchronization logs (see the sketch after this list for a quick way to find them). To do that:
    • Open Active Directory Users and Computers
    • Enable Advanced Features in the View menu
    • Open the user object that can't sync
    • Go to the Security tab, then click Advanced
    • Make sure the box to inherit permissions is checked. Before you check it, though, confirm that enabling inheritance will not bring down permissions that you do not want on that object
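
To find the affected accounts in bulk instead of clicking through them one by one, here is a minimal sketch, assuming the ActiveDirectory PowerShell module. It lists users that SDProp has touched (adminCount = 1) and whose ACL inheritance is disabled:

Import-Module ActiveDirectory

# Users processed by SDProp have adminCount set to 1; AreAccessRulesProtected
# is $true when permissions inheritance is disabled on the object
Get-ADUser -Filter 'adminCount -eq 1' -Properties nTSecurityDescriptor |
    Where-Object { $_.nTSecurityDescriptor.AreAccessRulesProtected } |
    Select-Object SamAccountName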

That's all. Thanks for reading.

VSTS/TFS task to execute SQL scripts

I have published a VSTS/TFS build task that allows you to execute a SQL script as part of your build or release pipelines. What makes this task different from similar tasks in the VSTS Marketplace is that you can execute the SQL script in a number of ways:

  • Execute a SQL script from the agent using integrated Windows authentication
  • Execute a SQL script from the agent using SQL authentication
  • Execute a SQL script on the remote server using Windows authentication, which is handy when you have to do cross-domain deployments and/or when you do not have SQL client tools installed on the agent
  • Execute a SQL script on the remote server using SQL authentication, which is handy when you do not have SQL client tools installed on the agent

The tool is free and it works. You can download it from the VSTS Marketplace.

Please let me know if you have any comments or find any bugs. Thanks for reading.

Check time elapsed and time remaining for running SQL job

Just thought I'd share this handy SQL statement that lets you check the time elapsed and the time remaining for a running SQL job. It gives you a better idea of how much longer you'll have to wait for the job to complete. Anyway, here is a script that checks the time elapsed and time remaining (in minutes) for a running backup job:

SELECT db_name(database_id) AS database_name,
       command,
       percent_complete,
       'elapsed' = total_elapsed_time / 60000.0,          -- milliseconds to minutes
       'remaining' = estimated_completion_time / 60000.0  -- milliseconds to minutes
FROM sys.dm_exec_requests
WHERE command LIKE 'BACK%'
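If you prefer to watch the progress from PowerShell, here is a minimal sketch using the same query. It assumes the SqlServer module (for Invoke-Sqlcmd) and a hypothetical server name:

$server = "SQLSERVER01"   # hypothetical server name
$query = "SELECT db_name(database_id) AS database_name, command, percent_complete,
          total_elapsed_time / 60000.0 AS elapsed_minutes,
          estimated_completion_time / 60000.0 AS remaining_minutes
          FROM sys.dm_exec_requests WHERE command LIKE 'BACK%'"

# Poll every 30 seconds until the backup request disappears from the DMV
while ($rows = Invoke-Sqlcmd -ServerInstance $server -Query $query) {
    $rows | Format-Table -AutoSize | Out-String | Write-Host
    Start-Sleep -Seconds 30
}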

Docker for Windows Server 2016 requires update KB3176936

If you want to try Docker and containers on the new Windows Server 2016, you would do the following:

  1. Add Containers feature using Server Manager, which is easy to do
  2. Then you open PowerShell with elevated privileges
  3. First, install the DockerMsftProvider module from the PowerShell Gallery (it plugs into the OneGet/PackageManagement framework):
    Install-Module -Name DockerMsftProvider -Repository PSGallery -Force
  4. Next, use OneGet to install the latest version of Docker:
    Install-Package -Name docker -ProviderName DockerMsftProvider

At this point, you get an error saying "The docker installer requires update KB3176936". What is that about?!? Weird. What you need to do is run sconfig, choose option 6, and then A and A to install all updates. It might take a few minutes to download and install the updates, so be patient.

This works for Server 2016 no-desktop (Core) installs as well as installs with the UI.

  1. After the updates are installed, restart your server from the UI or with the Restart-Computer -Force command.
  2. Now you're ready to retry installing the latest version of Docker with OneGet:
    Install-Package -Name docker -ProviderName DockerMsftProvider
  3. After the installation completes, restart your server again from the UI or with Restart-Computer -Force (see the quick check after this list).
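
After the final restart, a quick sanity check confirms the Docker engine is up before you pull any images (this assumes the default Windows service name, docker):

# The Docker engine is installed as a Windows service named 'docker'
Get-Service docker

# Should print both client and server versions if the engine is reachable
docker version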

Now you're ready to use Docker. For example, download a pre-created .NET sample image from the Docker Hub registry and deploy a simple container running a .NET Hello World application with the following command:

docker run microsoft/sample-dotnet

This command uses docker run to deploy the .NET container. It will take a few minutes to download the container image, though.

Update Portal Settings for multiple team projects

There is a great CodePlex project that allows you to update Portal Settings for multiple team projects at the same time: https://features4tfs.codeplex.com/. Whenever you upgrade TFS, you have to re-enable and re-point team project portals in your upgraded TFS project collections, and if you have a lot of team projects to update, it becomes a very tedious task. The Features4TFS tool allows you to update all of those items in one step. Very handy tool. The only issue with this tool is that it does not work with TFS 2015 Update 3 (only Update 2 or earlier).

I have downloaded the source code and updated the solution to make it work with TFS 2015 Update 3. You can download the updated solution from https://1drv.ms/f/s!AoNW-kvNWJ9ygsNWPA5LHTJ5fYREXQ. There are two files: one with binaries only, and one with source code and binaries.

Getting output from Remote PowerShell on Target Machines

The Remote PowerShell on Target Machines task in TFS/VSTS is awesome. Easy to use, and it works. The only thing that bugs me about that task is that I don't see the output of the remote script in the build console output. Luckily, this is easily fixed. All you have to do is use Write-Verbose -Verbose instead of Write-Output inside your PowerShell script.

For example, instead of:

Write-Output "Remote script is executing step 1-2-3"

use the following

Write-Verbose "Remote script is executing step 1-2-3" -verbose

You should now see the remote script output in the build console.

Persisting PowerShell variables between steps

In VSTS and in TFS 2015 Update 1 (and higher), you can use the following logging command to persist PowerShell variables between steps, by simply writing it to standard output (Write-Host) in PowerShell:

Write-Host ("##vso[task.setvariable variable=testvar;]testvalue")

 

This makes the variable available to be consumed by later steps as an input via the $(testvar) syntax, and the replacement will happen as expected. It's that easy.
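
The variable is also exposed to subsequent script steps as an environment variable (with the name upper-cased), so a later PowerShell step can read it back directly. A small sketch:

# In a later PowerShell step of the same build
Write-Host "testvar is: $env:TESTVAR"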

You can see the full list of the commands you can use from the scripts at https://github.com/Microsoft/vsts-tasks/blob/master/docs/authoring/commands.md

Customize TFS build steps on the fly based on build trigger

I have the following scenario. The client currently has multiple build definitions which are very similar (not identical, but similar) but have different triggers. For example, nightly builds run more/different tests, while gated or CI builds run fewer tests and perhaps fewer steps. They want to merge those build definitions into one. With TFS 2015 you can create multiple triggers on the same build definition, so we should be able to merge them. The problem is that the build steps would then have to be the same for every trigger, which is not an option, since we want gated/CI builds to be smaller/faster. So, we need to figure out a way to modify the build steps on the fly. Somehow. PowerShell to the rescue.

I wrote a small PowerShell script that checks whether the running build is a gated build and sets a build variable (in my case, a variable called TestFilterCriteria) in memory, which is later consumed by the next steps in the build definition.

function Is-This-Gated-Build ($buildid)
{
    # Query the TFS REST API for the build details to find out what triggered it
    $buildDetails = Invoke-RestMethod -Uri "http://TFSSERVER:8080/tfs/TFSCOLLECTION/TEAMPROJECT/_apis/build/builds/$($buildid)?api-version=2.0" -UseDefaultCredentials -Method Get
    $buildReason = $buildDetails.reason

    # Gated check-in builds are reported with the 'CheckInShelveset' reason
    return ($buildReason -eq 'CheckInShelveset')
}

# BUILD_BUILDID is one of the predefined build variables available to scripts
$gatedBuild = Is-This-Gated-Build $env:BUILD_BUILDID

if (!$gatedBuild)
{
    Write-Output "Running non-gated build..."

    # DO SOME POWERSHELL MAGIC HERE

    $TestFilterCriteriaToSet = "(Name!=TestProcessQueue)"
    Write-Host ("##vso[task.setvariable variable=TestFilterCriteria;]$TestFilterCriteriaToSet")
}
else
{
    Write-Output "Running gated build..."

    $TestFilterCriteriaToSet = "(TestCategory!=LongRunning & TestCategory!=Commit & TestCategory!=IntermittentFailure & TestCategory!=ExecutableLaunch)"
    Write-Host ("##vso[task.setvariable variable=TestFilterCriteria;]$TestFilterCriteriaToSet")
}

Next steps: save the script, check it in, and add it as a step in the build definition. Consume the TestFilterCriteria variable in a later step using the $(TestFilterCriteria) syntax, for example in the Visual Studio Test step. That's it.

Downgrade SQL Server database from Enterprise to Standard Edition

I've seen quite a few clients who have SQL Server Enterprise Edition installed even though they don't use any of the Enterprise features of SQL Server. For example, TFS does not require SQL Server Enterprise Edition unless you need encryption, compression, or SQL clusters; it works just fine with SQL Server Standard Edition. So, if you're not using Enterprise features, having SQL Server Enterprise Edition installed seems like overkill. An expensive overkill. Fortunately, this can be fixed if you need to downgrade a SQL Server database from Enterprise Edition to Standard Edition.

First, let's check which edition of SQL Server you're running. To do that, open SQL Server Management Studio, connect to your SQL Server, and run the following query:

SELECT @@version

Let's assume that you're running SQL Server Enterprise Edition. To downgrade a database from Enterprise Edition to Standard Edition, we need to disable compression on that database, since data compression is an Enterprise-only feature. Before we do that, we need to make sure of the following:

  • BACKUP your database, because you know… it's the smart thing to do
  • Make sure that your SQL Server has enough disk space available to accommodate the larger database size. Remember that uncompressing a database will likely increase the size of the database.

To disable compression, run the following script against the database where you want to disable compression.

SELECT DISTINCT 'ALTER TABLE [' + SCHEMA_NAME(schema_id) + '].[' + name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);'
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U'
  AND data_compression_desc != 'NONE'
UNION
SELECT 'ALTER INDEX ALL ON [' + SCHEMA_NAME(schema_id) + '].[' + name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);'
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U'
  AND data_compression_desc != 'NONE'

If any of the tables in the database have compression enabled, this script will generate more SQL statements as its output. When that happens, copy the generated statements and execute them against the same database to disable compression. Do that for every database that has compression enabled (or see the sketch below for a way to automate it). That's it.
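
If you have many databases to process, here is a minimal PowerShell sketch that generates and runs the decompression statements in one pass. It assumes the SqlServer module (for Invoke-Sqlcmd), and the server and database names are hypothetical placeholders:

# Hypothetical server and database names; replace with your own
$server = "SQLSERVER01"
$db     = "Tfs_DefaultCollection"

# Same generator query as above, with the output column aliased for convenience
$generator = @"
SELECT DISTINCT 'ALTER TABLE [' + SCHEMA_NAME(schema_id) + '].[' + name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);' AS stmt
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U' AND data_compression_desc != 'NONE'
UNION
SELECT 'ALTER INDEX ALL ON [' + SCHEMA_NAME(schema_id) + '].[' + name + '] REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = NONE);' AS stmt
FROM sys.partitions p
JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.type = 'U' AND data_compression_desc != 'NONE'
"@

# Run each generated ALTER statement; rebuilds can be slow, so disable the timeout
foreach ($row in Invoke-Sqlcmd -ServerInstance $server -Database $db -Query $generator) {
    Write-Host "Executing: $($row.stmt)"
    Invoke-Sqlcmd -ServerInstance $server -Database $db -Query $row.stmt -QueryTimeout 0
}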