Issue starting User Code Service on SharePoint 2010

Today I started the “User Code Service” of SharePoint 2010 to enable execution of Sandboxed Solutions.

Before activation I created a new domain account for the service (*), created a managed account for that domain account, and changed the service credentials in Central Administration

(CA => Security => Configure service accounts)


(*) This is the step that causes the issue.


After starting the service on a couple of machines (“Services on Server” page in CA => System Settings) I tried to upload and activate my sandboxed solution.

But without success:


(“No available sandboxed code execution server could be found.”)


The problem: the corresponding Windows service “SharePoint 2010 User Code Host”, a.k.a. “SPUserCodeV4”, was not running, although it was configured with the correct account:
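A quick way to check the state of that service on each server (using only the built-in Get-Service cmdlet; “SPUserCodeV4” is the internal service name mentioned above):

```powershell
# "SPUserCodeV4" is the internal name of the "SharePoint 2010 User Code Host" Windows service.
Get-Service SPUserCodeV4 | Format-List DisplayName, Status
```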



After starting it manually I got these errors in the ULS log:


[Some BINGing around later…]


The *new* user account for the User Code Service does not have the rights it needs to access the performance counters required to do its job.
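One way to grant those rights (a sketch; “CONTOSO\sp_usercode” is a placeholder for the new domain account, and the group choice should be verified against the messages in your ULS log) is to add the account to the local “Performance Monitor Users” group on every server that runs the service, then restart the service:

```powershell
# Run elevated on each server where the User Code Service is started.
# CONTOSO\sp_usercode is a placeholder - replace it with your new service account.
net localgroup "Performance Monitor Users" CONTOSO\sp_usercode /add

# Restart the Windows service so it runs with the updated group membership.
Restart-Service SPUserCodeV4
```

Group membership changes only take effect for new logon sessions, which is why restarting the service (or the server) is required.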

PowerShell script “SPFolderImport” for importing a whole file system folder into a SharePoint document library using the Content Migration API

Over the last few days I’ve created a PowerShell script that enables you to import a whole file system folder into a SharePoint document library using the SharePoint Content Migration API.

It does not create a library, create folders, or upload files one by one.

Instead it creates an import structure in an output directory and imports that structure using the Content Migration API.

See this MSDN article for further information on that:
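At its core this approach drives Microsoft.SharePoint.Deployment.SPImport. A minimal sketch of that call (assuming an uncompressed import package has already been created in C:\Temp\ImportPackage; the class and property names are from the Content Migration API, the values are placeholders):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Describe where the import package lives and which site to import into.
$settings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
$settings.SiteUrl = "http://sharepoint.local"      # placeholder site URL
$settings.FileLocation = "C:\Temp\ImportPackage"   # folder that holds the import structure
$settings.FileCompression = $false                 # package is a plain folder, not a .cmp file

# Run the import against the site.
$import = New-Object Microsoft.SharePoint.Deployment.SPImport($settings)
$import.Run()
```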

Here I’d like to describe in short how it works.

You can download the script and some demo files from Codeplex:

This script may be used to import folders and files into SharePoint. But it also shows how to use the Content Migration API in a PowerShell script.


1. This is the SPFolderImport package folder:


The “SPFolderImport.ps1” file is the script that executes the import tasks. You can dot-source it into your own scripts as shown in “demo.ps1”.

2. “demo.ps1” shows you how to use the “SPFolderImport” script:


Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue; cls

."$(split-path $MyInvocation.MyCommand.Path)\SPFolderImport.ps1"

$outputDirectory = "$($env:Temp)\SPFolderUplodate_tmp"

$result = SPFolderImport -inDir "$(split-path $MyInvocation.MyCommand.Path)\demoData" -outDir $outputDirectory `
            -dstUrl "http://sharepoint.local/shared documents/SPFolderImport_test" `
            -contentTypeMapping @(@{Type="File"; Filter="*"; ContentTypeName="Document 2"}, @{Type="Folder"; Filter="*"; ContentTypeName="Folder 2"}) `
            -executeImport $false -copyfilesToImportPackage $true -removeExistingDestFolder


At first you must have the SharePoint PowerShell snap-in loaded. After that you need to load the SPFolderImport script itself; in my case it’s in the same directory as the demo script. You could also rename “SPFolderImport.ps1” to “SPFolderImport.psm1” and copy it to the personal PowerShell module store in your Windows user profile or to the local computer’s PowerShell module repository.
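The module variant mentioned above could look like this (a sketch; the module folder name “SPFolderImport” and the paths are my assumptions):

```powershell
# Copy the script into the per-user module store, renamed to .psm1.
$moduleDir = Join-Path ([Environment]::GetFolderPath("MyDocuments")) "WindowsPowerShell\Modules\SPFolderImport"
New-Item -ItemType Directory -Path $moduleDir -Force | Out-Null
Copy-Item .\SPFolderImport.ps1 (Join-Path $moduleDir "SPFolderImport.psm1")

# Afterwards the function can be loaded by name instead of dot-sourcing:
Import-Module SPFolderImport
```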

The next step is to specify an output folder where the import structure will be created.

The last four lines belong to one function call: the call of the “SPFolderImport” function.

Parameter list for the “SPFolderImport” function:

inDir: the folder to import into the SharePoint document library

outDir: the folder for the import structure created by the script

dstUrl: the destination URL of the imported folder. It is not the root folder of a document library; instead it is the URL of the folder that will be created during the import!

contentTypeMapping: here you can specify content type mappings for the import process. For each mapping you need to add a hash table like this:

@{Type=”<type>”; Filter=”<filter>”; ContentTypeName=”<content_type_name>”}

<type> = “File” or “Folder”

<filter> = a string like “*” (=> all files), “*.docx” (=> files that end with “.docx”) or “A*” (=> files that start with “A”)

<content_type_name> = the name of a content type that is already assigned to the destination document library

executeImport (switch): if set, the import process is executed immediately. Otherwise the script just creates the import structure in the output folder.

default: false = do not execute

copyFilesToImportPackage (switch): if set, the files to import will be copied into the import structure (output folder).

default: true = do copy files

removeExistingDestFolder (switch): if set, the destination folder in the destination document library will be deleted if it already exists. Otherwise the folder will not be deleted and the script stops if the folder exists.

default: false = do not delete

retainObjectIdentity (switch): if set, the object identity will be retained. For further information please read the blog post by Stefan Goßner (“Problem 1”).

default: true = retain object identity

useFSTimestamps (switch): if set, the “Created” and “Modified” properties of the objects (folders and files) will be set to the file system attributes “CreationTime” and “LastWriteTime”.

default: false = do not use file system timestamps

quiet (switch): if set, the script will not write any output using “write-host”. Otherwise it reports the steps of processing.


3. Let’s have a look at the process.

This is the root folder of my demo data:


I’m working on a team site with URL “http://sharepoint.local”

On the site I’ve created two site content types “Document 2” and “Folder 2”:



“Document 2” has an additional site column (“Start Date”). This column and the “Title” column are mandatory:



“Start Date” has a default value of data type “Date”: “Today”. “Title”, in contrast, has no default value. => This will become interesting later.


Now I create a document library: “My SPFolderImport test library”:


Then I enabled content type management in the library settings (=> Advanced Settings), added my new content types “Document 2” and “Folder 2”, and added the columns “Title” and “Start Date” to the default view.




Then I copied the link of the library’s source folder and inserted it into the demo script, extended by the name of the destination import folder:


BEFORE execution the library is empty:


Then I executed the “demo.ps1” script.

This is the script output:


Let’s have a look into the output folder that contains the import structure:


It contains all necessary files and information to feed SharePoint’s Content Migration API and import the files and folders into the document library.

This is the document library AFTER import:


Inside the “SPFolderImport” folder of the document library:


Inside “Folder 2 (DOCX)”:


As you see:

  • The “modified” date is “now”
  • The “modified by” user is the user that executes the script


  • The required “Start Date” column is filled with its default value (“Today”)
  • The required “Title” column is empty (!!!)

Let’s edit one file’s properties:


As you see:

  • the content type is set to “Document 2” as specified in the mapping in the demo script
  • the “Title” field is empty => you will not be able to save the document properties now without entering a value for “Title”!

Let’s have a look at a folder’s properties:


The content type is set as specified in the mapping in the demo script


Last words:

  • I’ve tested the script in many ways
    • Folder in default “Shared Documents” library on site collection root web
    • Folder in default “Shared Documents” library on sub web of a site collection
    • Sub Folder in default “Shared Documents” library on sub web of a site collection
    • all of the above also with a custom document library created in the browser
  • Possible improvements
    • Import versions of files
    • Implement user mapping for file and/or folder import

If you intend to use this in a production environment, please do lots of tests before you go ahead!