Olav Aukan Getting information off the Internet is like taking a drink from a fire hydrant…


How to backup SharePoint using PowerShell

Lately I've been reading up on - and experimenting with - PowerShell to automate a lot of the things I do in SharePoint. The original motivation was a deployment gone bad (i.e. too many manual steps + too little time = too many errors), and it got me rethinking my whole approach to managing SharePoint.

My previous attempts at automating the build -> package -> deploy process with a .bat file calling MSBuild and STSADM commands had failed miserably about two years ago. It would not wait for the solution to finish retracting before trying to remove it, or it would try to activate a feature before the solution was finished deploying, etc. Also, since it was one giant monolithic script, any errors early on in the process would cause all sorts of problems.

There are ways to deal with this in .bat files, but they don't even come close to the cool stuff you can do with PowerShell! Therefore I'm planning on writing a couple of posts about using PowerShell to manage SharePoint based on the things I've been trying out so far. Keep in mind that I'm still learning and some of the stuff I write about might be stupid, inefficient or downright wrong. With that disclaimer out of the way I present my first PowerShell script: Performing a full farm backup.

# This function performs a complete backup of the local farm
function SP-Backup-Farm {

	param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[string]$BackupFolder
	)

	process {

		Write-Host "Attempting full backup of the farm."

		# Create the backup settings
		$Settings = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreSettings]::GetBackupSettings($BackupFolder, "Full");

		# Set optional operation parameters
		$Settings.IsVerbose = $true;
		$Settings.UpdateProgress = 10;
		$Settings.BackupThreads = 10;

		# File size details
		$BackupSize = New-Object UInt64
		$DiskSize = New-Object UInt64
		$DiskFreeSize = New-Object UInt64

		Write-Host "Backup Location:" $BackupFolder

		# Check that the target folder exists
		if (Test-Path $BackupFolder) {

			Write-Host "Backup Location Exists: True"

			# Backup operation details
			$BackupID = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::CreateBackupRestore($Settings);
			$BackupObjects = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::FindItems($BackupID, "Farm");

			# Get file size info
			$BackupSize = [Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::DiskSizeRequired($BackupID)
			[void][Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::DiskSize($BackupFolder, [ref]$DiskFreeSize, [ref]$DiskSize)

			# Check if there is enough free disk space
			$HasEnoughSpace = $false
			if ($DiskFreeSize -gt $BackupSize) {
				$HasEnoughSpace = $true
			}

			$BackupSizeString = Util-Convert-FileSizeToString $BackupSize
			$DiskSizeString = Util-Convert-FileSizeToString $DiskSize
			$DiskFreeSizeString = Util-Convert-FileSizeToString $DiskFreeSize

			Write-Host "Total Disk Space:" $DiskSizeString
			Write-Host "Free Disk Space:" $DiskFreeSizeString
			Write-Host "Required Disk Space:" $BackupSizeString

			if ($HasEnoughSpace) {

				Write-Host "Sufficient Free Disk Space: True"

				# Set the backup as the active job and run it
				if ([Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::SetActive($BackupID)) {

					$BackupObjectCount = $BackupObjects.Count

					Write-Host "Successfully set backup job as the active job."
					Write-Host "Backup consists of $BackupObjectCount object(s)"
					Write-Host "Backup Started"

					foreach ($BackupObject in $BackupObjects) {
						if ([Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::Run($BackupID, $BackupObject)) {
							Write-Host "Backup Completed"
						}
						else {
							Write-Host "An unexpected error occurred!" -ForegroundColor Yellow
							Write-Host "Backup Failed" -ForegroundColor Yellow
						}
					}
				}
				else {
					Write-Host "Unable to set backup job as the active job." -ForegroundColor Yellow
					Write-Host "Backup Failed." -ForegroundColor Yellow
				}
			}
			else {
				Write-Host "Sufficient Free Disk Space: False" -ForegroundColor Yellow
				Write-Host "Backup Failed" -ForegroundColor Yellow
			}
		}
		else {
			Write-Host "Backup Location Exists: False" -ForegroundColor Yellow
			Write-Host "Backup folder doesn't exist or the service account does not have read/write access to it." -ForegroundColor Yellow
			Write-Host "Backup Failed." -ForegroundColor Yellow
		}

		# Clean up the operation
		if ($BackupID -ne $null) {
			[void][Microsoft.SharePoint.Administration.Backup.SPBackupRestoreConsole]::Remove($BackupID)
		}
	}
}

# This function returns a "user friendly" display value for a file size in bytes
function Util-Convert-FileSizeToString {

    param (
		[Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
		[UInt64]$sizeInBytes
	)

    # Number of decimal digits in the formatted value
    $sigDigits = 2

    switch ($sizeInBytes) {
        {$sizeInBytes -ge 1TB} {"{0:n$sigDigits}" -f ($sizeInBytes/1TB) + " TB" ; break}
        {$sizeInBytes -ge 1GB} {"{0:n$sigDigits}" -f ($sizeInBytes/1GB) + " GB" ; break}
        {$sizeInBytes -ge 1MB} {"{0:n$sigDigits}" -f ($sizeInBytes/1MB) + " MB" ; break}
        {$sizeInBytes -ge 1KB} {"{0:n$sigDigits}" -f ($sizeInBytes/1KB) + " KB" ; break}
        Default { "{0:n$sigDigits}" -f $sizeInBytes + " Bytes" }
    }
}
The convert-bytes-to-string function was something I found on another blog and adapted to PowerShell, so I can't really take credit for that one. Also, it took about 3 hours to do a full backup on my VMware machine with about 50 GB of content databases. Your mileage may vary...
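For reference, this is roughly how I invoke the function above. The assembly load and the backup path are assumptions - adjust them for your own farm, and make sure the folder is one the farm account can write to:

```powershell
# Load the SharePoint object model (needed when running outside a pre-configured shell)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# Kick off a full farm backup to a hypothetical UNC share
SP-Backup-Farm "\\backupserver\SharePointBackups\"
```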


How to ghost an unghosted content type

A while back I wrote a post about content types becoming unghosted when you edit them in the UI and there not being any way of reversing this without directly modifying the content database. That would put you in an unsupported state, so obviously it's not an option for most people. But as it turns out there is a supported way of doing this after all, it's just not very well documented...

A colleague of mine was having this same problem and informed me that he was able to fix it with the STSADM Deactivatefeature operation by adding the -force parameter.

Let us see what the Holy Gospel according to TechNet has to say about the -force parameter:

"force - Forces the feature to be uninstalled."

In other words, not much help to be found in the scriptures. But it works! When the feature containing the unghosted content type(s) is deactivated with the -force parameter the Definition column in the ContentTypes table goes back to NULL, meaning it is now using the XML definition again.
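Applied to this case, the fix looked something like the following - the feature name and site URL are placeholders, and reactivating the feature afterwards is what puts the content types back under the control of their CAML definition:

```
stsadm -o deactivatefeature -name ContentTypeFeature -url http://server -force
stsadm -o activatefeature -name ContentTypeFeature -url http://server
```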

You can then use the gl-propagatecontenttype custom STSADM command from Gary Lapointe with the -updatefields parameter to push changes in the site content type down to the list content type.
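Based on Gary Lapointe's documentation, the call should look roughly like this - the URL and content type name are placeholders:

```
stsadm -o gl-propagatecontenttype -url "http://server" -contenttype "My Content Type" -updatefields
```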


Content types can be unghosted too!

Update: There is a simple solution to this problem after all. See How to ghost an unghosted content type for more details.

Just about all SharePoint projects require that you create some custom content types and/or add some custom site columns, and this is usually done through a feature that contains the CAML definition for those content types and site columns, as described in the MSDN article Content Type Deployment Using Features. It is also often the case that for some reason these CAML definitions will change during the project. Maybe a certain feature was not specified correctly, the requirements changed, or additional features are added that require new columns to an already existing content type. This is where the first problems start to arise...

If you read the MSDN article about deploying content types you will see the following warning:

"Under no circumstances should you update the content type definition file for a content type after you have installed and activated that content type. Windows SharePoint Services does not track changes made to the content type definition file. Therefore, you have no method for pushing down changes made to site content types to the child content types."

This warning is usually ignored during development, as clean builds are done regularly where everything is deleted and the solution is deployed to a fresh site collection. The problem usually doesn't show up until after the solution has gone into production and a later update tries to make some changes to these content type definitions. These changes will be pushed to the site content types, but the list content types are basically detached copies of the site content types and therefore will not be updated.

Luckily there is a hack solution to this problem, and it comes in the form of a custom STSADM command. This command will push changes to the site content type down to all list content type instances, thus solving the problem once and for all. Or perhaps not? I stumbled upon this command while trying to figure out how to do just such an update, and after trying it out with success in our test environment I decided to go ahead and use the same command in the production environment. Sadly it did not work, and I could not figure out why.

Then I read a post on our internal blog by Jan Inge Dalsbø with a title of something like "Mysteries of SharePoint content types" where he explains that site content types can actually be unghosted (detached from their CAML definition) if they are edited in the SharePoint UI. Any changes to the CAML definition made after this has happened will not be pushed to the site content type, and you are - pardon my French - properly fucked! The content type will have to be updated through the UI or the object model from that point on.

To verify that the content type has in fact been unghosted you need to check the content database. To do this you will have to run a query against it (obviously), and this will actually leave you in an unsupported state! Yep, we all knew you weren't supposed to modify any of the SharePoint databases, but as it turns out you're not even allowed to read from them. That being said, here are two before and after screenshots of a content type being unghosted.

Notice that in the second screenshot the definition field has gone from being NULL to containing the content type CAML definition. The content type has now been detached from its definition, updates will fail, hairs will be pulled, gods will be cursed etc. etc. I haven't dared to poke around in the production database to see if this is actually what happened in my case, but I'm pretty certain it's the same problem.

So where do you go from here? I honestly don't know... Søren Nielsen, the original author of the code used in the STSADM command mentioned earlier, has a hack solution to this problem as well, but it involves modifying the content database. In his blog post Convert “virtual” content types to “physical” he explains the steps needed to reattach the content type to its CAML definition. Keep in mind that this will leave you in an unsupported state!

This was all new to me, and I was a bit shocked to find out that this is how it works. Basically Microsoft has given us a big fat rope to hang ourselves with here. From now on I will definitely tell all site collection administrators to never touch the content types!


Error when uploading large files to SharePoint running on Windows Server 2008

A client was having problems uploading a 29 MB file to their SharePoint server. Internet Explorer would simply show the "This page can not be displayed" error page, so I figured the maximum upload size was probably set too low. There are two ways to set this limit: through SharePoint Central Administration or with STSADM.

Using SharePoint Central Administration

  1. Click Start -> All Programs -> Administrative Tools -> SharePoint Central Administration
  2. Click Application Management -> Web application general settings
  3. On the Web Application General Settings page, click the Web application that you want to change.
  4. On Maximum upload size, type the maximum file size in megabytes that you want, and then click OK.

You can specify a maximum file size of 2047 MB.


Using STSADM

To set the maximum upload size to 100 MB, use the following command:

  1. Click Start -> Run
  2. Type 'cmd' and click OK.
  3. Type 'stsadm -o setproperty -pn max-file-post-size -pv 100 -url http://server' and hit Enter.
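You can verify the setting afterwards with the corresponding getproperty operation (same placeholder server URL):

```
stsadm -o getproperty -pn max-file-post-size -url http://server
```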

However, in the client's case the maximum upload size was already set to 100 MB, so clearly this was not the cause of the error. I thought maybe IIS was timing out, but realized this couldn't be the problem either, since the error happened the second the user clicked the Upload button and not 120 seconds later as would be expected with a timeout. The default timeout value is 120 seconds, and I confirmed that this had not been changed.

It turns out this is an issue with Windows Server 2008 when uploading files bigger than 28 MB, as described at the bottom of KB925083. The solution is to add the following snippet of XML to the web application's web.config file after the <configSections> tag.

      <system.webServer>
        <security>
          <requestFiltering>
            <requestLimits maxAllowedContentLength="104857600"/>
          </requestFiltering>
        </security>
      </system.webServer>

Replace the value with the maximum upload size for your web application in bytes (1 megabyte = 1,048,576 bytes).
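If you don't want to do the multiplication by hand, PowerShell treats 1MB as a numeric constant, so it can calculate the value for you:

```powershell
# 100 MB expressed in bytes - paste the result into maxAllowedContentLength
100 * 1MB   # 104857600
```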