Olav Aukan Getting information off the Internet is like taking a drink from a fire hydrant…

30Mar/11

How To Backup And Restore IIS 7 Metabase With PowerShell

As I mentioned in my last blog post, I've been playing around with PowerShell for SharePoint automation for a while now. One of the problems with SharePoint is its complicated architecture, which makes a seemingly simple task - backing up the system - incredibly complicated. Much of the complexity comes from the sheer number of options to choose from: SQL backups of the content databases, Stsadm.exe backups of the whole farm or parts of it, VMware snapshots of the server (which, by the way, are not supported by Microsoft), file backups of the 12 hive, solution file backups, web.config backups, IIS backups, third party backup solutions that do some or all of these things, and so on.

This blog post is not going to solve those issues for you, but it will at least give you one piece of the puzzle: how to back up and restore the IIS metabase with PowerShell.

# This function performs a backup of the IIS Metabase
function SP-Backup-IIS {

	param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[string]
		$BackupFolder		
	)
	
	process {
	
		UI-PrintAction "Attempting to backup the Internet Information Services Metabase"
		
		Write-Host "Checking IIS version..."
		Write-Host
		
		$IISVersion = (Get-ItemProperty HKLM:\SOFTWARE\Microsoft\InetStp | select SetupString, *Version*)
		
		Write-Host "IIS Version:"
		$IISVersion
		Write-Host
		
		$ComputerName = Get-Content Env:Computername
		# Use a culture-independent timestamp that is always safe in a file name
		$Time = [DateTime]::Now.ToString("yyyy-MM-dd_HH.mm.ss")
		$BackupName = "IIS-Metadata-$ComputerName-$Time"
		$BackupFileName = "$BackupName.zip"
		
		switch ($IISVersion.MajorVersion)
		{
			# IIS 7
			7 {
				$BackupFilePath = Join-Path -Path $BackupFolder -ChildPath $BackupFileName

				Write-Host "Starting backup of IIS metadata."
				Write-Host
				
				#Perform backup
				& $env:windir\system32\inetsrv\appcmd.exe add backup $BackupName				
				
				#Zip and copy backup to destination
				$IISRootBackupFolder = Get-Item $env:windir\system32\inetsrv\backup
				$IISBackupFolder = Join-Path -Path $IISRootBackupFolder -ChildPath $BackupName								
				Zip-Compress-Folder $IISBackupFolder $BackupFilePath
												
				Write-Host "Backup Complete"
			}
			
			# Other version of IIS
			default { Write-Host "This version of IIS is not supported in this script." -ForegroundColor Yellow }
		}
		Write-Host
	}
}

# This function restores a backup of the IIS Metabase
function SP-Restore-IIS {

	param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[string]
		$BackupName		
	)
	
	process {
	
		UI-PrintAction "Attempting to restore the Internet Information Services Metabase"
		
		Write-Host "Checking IIS version..."
		Write-Host
		
		$IISVersion = (Get-ItemProperty HKLM:\SOFTWARE\Microsoft\InetStp | select SetupString, *Version*)
		
		Write-Host "IIS Version:"
		$IISVersion
		Write-Host
				
		switch ($IISVersion.MajorVersion)
		{
			# IIS 7
			7 {

				Write-Host "Starting restore of IIS metadata."
				Write-Host
				
				#Perform restore
				& $env:windir\system32\inetsrv\appcmd.exe restore backup $BackupName				
								
				Write-Host
				Write-Host "Restore Complete"
			}
			
			#Other version of IIS
			default { Write-Host "This version of IIS is not supported in this script." -ForegroundColor Yellow }
		}
		Write-Host
	}
}


# This function zips the contents of a folder
function Zip-Compress-Folder { 

	param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[string]
		$SourceFolder,		

		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=1)]
		[string]
		$ZipFileName		
	)
	
	process {
		
		Write-Host
		Write-Host "Creating new zip file."
		Write-Host
		Write-Host "Source: $SourceFolder"
		Write-Host "Destination: $ZipFileName"
		Write-Host
		
		$ParentFolder = (Get-Item $SourceFolder).Parent.FullName
		
		# Check if file exists
		if (test-path $ZipFileName) { 
		  Write-Host "The file $ZipFileName already exists." -ForegroundColor Yellow
		  return
		} 
		
		# Create zip file		
		Set-Content $ZipFileName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
		(Dir $ZipFileName).IsReadOnly = $false 
		$ZipFile = (New-Object -com shell.application).NameSpace($ZipFileName) 
		
		Write-Host "Adding $SourceFolder to the zip file."
		Write-Host
				
		$ZipFile.CopyHere((Get-Item $SourceFolder).FullName)
		
		# CopyHere is asynchronous - wait for the folder to show up in the zip
		# so the script doesn't move on before the archive is actually written
		while ($ZipFile.Items().Count -eq 0) {
			Start-Sleep -Milliseconds 500
		}
		
		Write-Host "Zip file created."
		Write-Host
	}
}

SP-Backup-IIS will back up the IIS metabase, zip the backup folder and copy it to the location you specify in the $BackupFolder parameter. SP-Restore-IIS will restore the IIS metabase from the backup you specify in the $BackupName parameter. This parameter corresponds to the name of the backup folder in 'C:\Windows\System32\inetsrv\backup'. You could expand on this by having the restore script first fetch your zip file and extract it to the backup folder. Happy scripting!
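If you want to go down that route, a rough sketch of the extraction step might look like this. Zip-Extract-ToFolder is a hypothetical helper (not part of the scripts above), and the paths in the usage example are placeholders; it uses the same Shell.Application COM object as Zip-Compress-Folder:

```powershell
# Hypothetical helper: extract a zipped metabase backup into a destination folder
function Zip-Extract-ToFolder {

	param (
		[Parameter(Mandatory=$true, Position=0)]
		[string]
		$ZipFileName,

		[Parameter(Mandatory=$true, Position=1)]
		[string]
		$DestinationFolder
	)

	$Shell = New-Object -com shell.application
	$ZipFile = $Shell.NameSpace((Get-Item $ZipFileName).FullName)
	$Destination = $Shell.NameSpace((Get-Item $DestinationFolder).FullName)

	# 0x10 = overwrite existing files without prompting
	$Destination.CopyHere($ZipFile.Items(), 0x10)
}

# Example usage (placeholder names) - extract first, then restore:
# Zip-Extract-ToFolder "D:\Backups\IIS-Metadata-SERVER01-2011-03-30_12.00.00.zip" "$env:windir\system32\inetsrv\backup"
# SP-Restore-IIS "IIS-Metadata-SERVER01-2011-03-30_12.00.00"
```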

21Mar/11

How to backup SharePoint using PowerShell

Lately I've been reading up on - and experimenting with - PowerShell to automate a lot of the things I do in SharePoint. The original motivation was a deployment gone bad (i.e. too many manual steps + too little time = too many errors), and it got me rethinking my whole approach to managing SharePoint.

My previous attempts at automating the build -> package -> deploy process with a .bat file calling MSBuild and STSADM commands had failed miserably about two years ago. It would not wait for the solution to finish retracting before trying to remove it, or it would try to activate a feature before the solution had finished deploying, and so on. Also, since it was one giant monolithic script, any error early in the process would cause all sorts of problems.

There are ways to deal with this in .bat files, but they don't even come close to the cool stuff you can do with PowerShell! Therefore I'm planning on writing a couple of posts about using PowerShell to manage SharePoint, based on the things I've been trying out so far. Keep in mind that I'm still learning, and some of the stuff I write about might be stupid, inefficient or downright wrong. With that disclaimer out of the way, I present my first PowerShell script: performing a full farm backup.

# This function performs a complete backup of the local farm
function SP-Backup-Farm {

	param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[string]
		$BackupFolder
	)

	process {

		Write-Host "Attempting full backup of the farm."

		# Create the backup settings
		$Settings = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreSettings]::GetBackupSettings($BackupFolder, "Full");

		# Set optional operation parameters
		$Settings.IsVerbose = $true;
		$Settings.UpdateProgress = 10;
		$Settings.BackupThreads = 10;

		# File size details
		$BackupSize = New-Object UInt64
		$DiskSize = New-Object UInt64
		$DiskFreeSize = New-Object UInt64

		Write-Host "Backup Location:" $BackupFolder

		# Check that the target folder exists
		if (Test-Path $BackupFolder)
		{
			Write-Host "Backup Location Exists: True"
			Write-Host

			# Backup operation details
			$BackupID = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::CreateBackupRestore($Settings);
			$BackupObjects = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::FindItems($BackupID, "Farm");

			# Get file size info
			$BackupSize = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::DiskSizeRequired($BackupID)
			[void][Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::DiskSize($BackupFolder, [ref]$DiskFreeSize, [ref]$DiskSize)

			# Check if there is enough free disk space
			$HasEnoughSpace = $false
			if ($DiskFreeSize -gt $BackupSize)
			{
				$HasEnoughSpace = $true
			}

			$BackupSizeString = Util-Convert-FileSizeToString $BackupSize
			$DiskSizeString = Util-Convert-FileSizeToString $DiskSize
			$DiskFreeSizeString = Util-Convert-FileSizeToString $DiskFreeSize

			Write-Host "Total Disk Space:" $DiskSizeString
			Write-Host "Free Disk Space:" $DiskFreeSizeString
			Write-Host "Required Disk Space:" $BackupSizeString
			Write-Host

			if($HasEnoughSpace)
			{
				Write-Host "Sufficient Free Disk Space: True"

				# Set the backup as the active job and run it
				if ([Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::SetActive($BackupID))
				{
					$BackupObjectCount = $BackupObjects.Count

					Write-Host "Successfully set backup job as the active job."
					Write-Host "Backup consists of $BackupObjectCount object(s)"
					Write-Host
					Write-Host "Backup Started"
					Write-Host

					foreach($BackupObject in $BackupObjects)
					{
						if (([Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::Run($BackupID, $BackupObject)))
						{
							Write-Host "Backup Completed"
						}
						else
						{
							Write-Host "An unexpected error occurred!" -ForegroundColor Yellow
							Write-Host "Backup Failed" -ForegroundColor Yellow
						}
					}
				}
				else
				{
					Write-Host "Unable to set backup job as the active job." -ForegroundColor Yellow
					Write-Host "Backup Failed." -ForegroundColor Yellow
				}
			}
			else
			{
				Write-Host "Sufficient Free Disk Space: False" -ForegroundColor Yellow
				Write-Host "Backup Failed" -ForegroundColor Yellow
			}
		}
		else
		{
			Write-Host "Backup Location Exists: False" -ForegroundColor Yellow
			Write-Host "Backup folder doesn't exist or the service account does not have read/write access to it." -ForegroundColor Yellow
			Write-Host "Backup Failed." -ForegroundColor Yellow
		}

		Write-Host

		# Clean up the operation
		if ($BackupID -ne $null)
		{
			[void][Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::Remove($BackupID)
		}
	}
}

# This function returns a "user friendly" display value for a filesize in bytes
function Util-Convert-FileSizeToString {

    param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[int64]
		$sizeInBytes
	)

    # Number of decimals to show in the formatted size
    $sigDigits = 2

    switch ($sizeInBytes)
    {
        {$sizeInBytes -ge 1TB} {"{0:n$sigDigits}" -f ($sizeInBytes/1TB) + " TB" ; break}
        {$sizeInBytes -ge 1GB} {"{0:n$sigDigits}" -f ($sizeInBytes/1GB) + " GB" ; break}
        {$sizeInBytes -ge 1MB} {"{0:n$sigDigits}" -f ($sizeInBytes/1MB) + " MB" ; break}
        {$sizeInBytes -ge 1KB} {"{0:n$sigDigits}" -f ($sizeInBytes/1KB) + " KB" ; break}
        Default { "{0:n$sigDigits}" -f $sizeInBytes + " Bytes" }
    }
}

The convert-bytes-to-string function was something I found on another blog and adapted to PowerShell, so I can't really take credit for it. Also, it took about 3 hours to do a full backup on my VMware machine with about 50GB of content databases. Your mileage may vary...
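As an aside, the only PowerShell-specific tricks in that function are the built-in size constants and the -f format operator; a quick console check (the $sigDigits value of 2 is an assumption here, since the function expects it to be defined somewhere):

```powershell
# PowerShell understands 1KB, 1MB, 1GB and 1TB as numeric constants
$sigDigits = 2
("{0:n$sigDigits}" -f (53687091200 / 1GB)) + " GB"   # ~50 GB of content databases
("{0:n$sigDigits}" -f (1536000 / 1MB)) + " MB"
```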

21Mar/11

How to ghost an unghosted content type

A while back I wrote a post about content types becoming unghosted when you edit them in the UI, and there not being any way of reversing this without directly modifying the content database. That would put you in an unsupported state, so obviously it's not an option for most people. But as it turns out, there is a supported way of doing this after all - it's just not very well documented...

A colleague of mine was having this same problem and informed me that he was able to fix it with the STSADM Deactivatefeature operation by adding the -force parameter.

Let us see what the Holy Gospel according to TechNet has to say about the -force parameter:

"force - Forces the feature to be uninstalled."

In other words, not much help to be found in the scriptures. But it works! When the feature containing the unghosted content type(s) is deactivated with the -force parameter, the Definition column in the ContentTypes table goes back to NULL, meaning the content type is now using the XML definition again.
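For reference, the full command looks something like this (the feature name and site URL are placeholders):

```
stsadm -o deactivatefeature -name MyContentTypesFeature -url http://intranet -force
```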

You can then use the gl-propagatecontenttype custom STSADM command from Gary Lapointe with the -updatefields parameter to push changes in the site content type down to the list content type.
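Again with placeholder values, and assuming the extensions are installed, that command would look roughly like this - check Gary Lapointe's own documentation for the full parameter list:

```
stsadm -o gl-propagatecontenttype -url "http://intranet" -contenttype "My Content Type" -updatefields
```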

24Nov/10

400 Bad Request (Header Field Too Long) when using Kerberos authentication

A client was having a problem with his SharePoint installation a while back that really confused me at first. Some users were unable to access their SharePoint 2007 intranet after Kerberos authentication had been configured. Instead of being logged in automatically as expected, they received a nice "This page cannot be displayed" in Internet Explorer. The error returned by IIS was "400 Bad Request (Header Field Too Long)".

After some research I stumbled upon this blog post: HTTP/1.1 400 Bad Request (Header Field Too Long), which pointed me to KB-820129. Basically, the problem boils down to the difference in the way NTLM and Kerberos authentication are performed. With NTLM authentication, the client sends its username and password to the server, which then checks the user's memberships by looking up the user in Active Directory. With Kerberos authentication, the client gets this information from Active Directory itself and sends a "token" to the server containing information about the user's memberships. The more AD groups the user is a member of, the bigger the token, and at some point it can become so large that IIS rejects the whole request.

This explained why only some users were having problems, as we discovered that the affected users had dozens of AD memberships, and those AD groups were nested inside other AD groups etc. The solution for us was to modify the following registry keys on all the SharePoint web front end servers in the farm:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters\MaxFieldLength = 65534
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters\MaxRequestBytes = 65534

After the registry modifications have been made, the HTTP service and all related IIS services have to be restarted, as described at the bottom of KB-820129:

To restart the HTTP service and all related IIS services, follow these steps:

  1. Click Start, click Run, type Cmd, and then click OK.
  2. At the command prompt, type net stop http and then press ENTER.
  3. At the command prompt, type net start http and then press ENTER.
  4. At the command prompt, type net stop iisadmin /y and then press ENTER.

    Note: Any IIS services that depend on the IIS Admin Service service will also be stopped. Notice the IIS services that are stopped when you stop the IIS Admin Service service. You will restart each service in the next step.

  5. Restart the IIS services that were stopped in step 4. To do this, type net start servicename at the command prompt and then press ENTER, where servicename is the name of the service that you want to restart. For example, to restart the World Wide Web Publishing Service, type net start "World Wide Web Publishing Service" and then press ENTER.
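If you have more than a couple of web front end servers, the registry change and the HTTP restart are worth scripting; a minimal sketch, to be run as administrator on each server (restarting the dependent IIS services from steps 4 and 5 is not handled here):

```powershell
# Bump the Http.sys header limits that cause the 400 error with large Kerberos tokens
$Params = "HKLM:\System\CurrentControlSet\Services\HTTP\Parameters"
Set-ItemProperty -Path $Params -Name MaxFieldLength -Value 65534 -Type DWord
Set-ItemProperty -Path $Params -Name MaxRequestBytes -Value 65534 -Type DWord

# Restart the HTTP service so the new limits take effect
& net stop http
& net start http
```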

12Oct/10

Content types can be unghosted too!

Update: There is a simple solution to this problem after all. See How to ghost an unghosted content type for more details.

Just about all SharePoint projects require that you create some custom content types and/or add some custom site columns, and this is usually done through a feature that contains the CAML definition for those content types and site columns, as described in the MSDN article Content Type Deployment Using Features. It is also often the case that for some reason these CAML definitions will change during the project. Maybe a certain feature was not specified correctly, the requirements changed, or additional features are added that require new columns to an already existing content type. This is where the first problems start to arise...

If you read the MSDN article about deploying content types you will see the following warning:

"Under no circumstances should you update the content type definition file for a content type after you have installed and activated that content type. Windows SharePoint Services does not track changes made to the content type definition file. Therefore, you have no method for pushing down changes made to site content types to the child content types."

This warning is usually ignored during development, as clean builds are done regularly where everything is deleted and the solution is deployed to a fresh site collection. The problem usually doesn't show up until after the solution has gone into production and a later update tries to make changes to these content type definitions. These changes will be pushed to the site content types, but the list content types are basically detached copies of the site content types and therefore will not be updated.

Luckily there is a hack solution to this problem, and it comes in the form of a custom STSADM command. This command will push changes to the site content type down to all list content type instances, thus solving the problem once and for all. Or perhaps not? I stumbled upon this command while trying to figure out how to do just such an update, and after trying it out with success in our test environment I decided to go ahead and use the same command in the production environment. Sadly it did not work, and I could not figure out why.

Then I read a post on our internal blog by Jan Inge Dalsbø, titled something like "Mysteries of SharePoint content types", where he explains that site content types can actually become unghosted (detached from their CAML definition) if they are edited in the SharePoint UI. Any changes to the CAML definition made after this has happened will not be pushed to the site content type, and you are - pardon my French - properly fucked! The content type will have to be updated through the UI or the object model from that point on.

To verify that a content type has in fact been unghosted you need to check the content database. To do this you will have to run a query against it (obviously), and this will actually leave you in an unsupported state! Yep, we all knew you weren't supposed to modify any of the SharePoint databases, but as it turns out you're not even allowed to read from them. That being said, here are two before and after screenshots of a content type being unghosted.

Notice that in the second screenshot the Definition field has gone from being NULL to containing the content type's CAML definition. The content type has now been detached from its definition; updates will fail, hairs will be pulled, gods will be cursed, etc. I haven't dared to poke around in the production database to see if this is actually what happened in my case, but I'm pretty certain it's the same problem.

So where do you go from here? I honestly don't know... Søren Nielsen, the original author of the code used in the STSADM command mentioned earlier, has a hack solution to this problem as well, but it involves modifying the content database. In his blog post Convert “virtual” content types to “physical” he explains the steps needed to reattach the content type to its CAML definition. Keep in mind that this will leave you in an unsupported state!

This was all new to me, and I was a bit shocked to find out that this is how it works. Basically Microsoft has given us a big fat rope to hang ourselves with here. From now on I will definitely tell all site collection administrators to never touch the content types!