PowerShell Script to Migrate FBA Users from SharePoint 2007 to 2010 *Including FBA Roles*

These days I did some migration work. For experimental purposes I configured my old MOSS 2007 demo machine to use ASP.NET SQL FBA, including MySites and profiles for the FBA users.

First I migrated the old Shared Service Provider config DB as the new User Profile Service App profile DB.


Then I migrated the content databases of a demo web app and the dedicated mysites web app.

After that I configured FBA for both web apps.

The next step was to migrate the “old” user accounts to claims accounts.

Look at the content databases. This is how the “UserInfo” table looks before migration:


At this point I needed to ensure that the web application was already set up to use FBA. The role provider and membership provider names *must* be the same as in 2007!

Therefore I executed


on both web apps ($webApp is an object that I retrieved using the cmdlet Get-SPWebApplication).
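As a sketch, the usual SharePoint 2010 claims migration on a web application looks like this (the URL is a placeholder; verify the calls against your farm before running them):

```powershell
# Sketch only: typical SharePoint 2010 claims migration on a web application.
# Run in the SharePoint 2010 Management Shell; the URL is a placeholder.
$webApp = Get-SPWebApplication "http://sharepoint.local"

# Ensure the web application runs in claims mode
$webApp.UseClaimsAuthentication = $true
$webApp.Update()

# Convert the existing user accounts in the content databases to claims logins
$webApp.MigrateUsersToClaims()
```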

After that, the content database’s UserInfo table looks like this:


THERE IS A PROBLEM! Look at this claim login, for example:


According to Wictor’s description of the claim structure:

The SharePoint 2010 claim encoding format


… this is WRONG! “i:0#.f” indicates a user logon name, but “allfbausers” is an FBA role!

The “i:0#.f” prefix must be translated to “c:0-.f”, which means:

c = “other” claim (not an identity claim)
- = the claim type is a role
. = the claim value data type is a string
f = the claim was issued by forms authentication (AuthN)


It must be migrated manually by using this PowerShell script:
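The script itself is part of the download below; a condensed sketch of the idea, renaming the wrongly encoded role logins with SPFarm.MigrateUserAccount (the provider and role names are examples, not taken from a real farm), could look like this:

```powershell
# Sketch: rewrite an FBA role that was encoded as a user identity claim
# ("i:0#.f|...") into a proper role claim ("c:0-.f|...").
# Provider and role names below are examples.
$farm = Get-SPFarm

$oldLogin = "i:0#.f|aspnetsqlmembers|allfbausers"        # wrong: identity claim
$newLogin = "c:0-.f|aspnetsqlroleprovider|allfbausers"   # correct: role claim

# Third parameter: enforce SID history (not needed for FBA accounts)
$farm.MigrateUserAccount($oldLogin, $newLogin, $false)
```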


If you do not do this step, your “old” FBA roles will not work as expected! This was my big issue over the last days, until I figured out that roles had been translated to claims the same way as user identities… which was of course not correct.

After executing the script the content database looks like this:




The next step is to migrate the profiles in the User Profile Service App…

Before migration, the UserProfile_Full table of the User Profile Service App’s “Profile” database looks like this:


Then I executed “MigrateFormsLegacyUsersToFormsClaims” on the User Profile Service Application using PowerShell.

$upa = Get-SPServiceApplication | where-object {$_.Name -eq $upaName} 

$upaName contains the name of the existing User Profile Service App.

If you get an error in the ULS log like this:


Error messages:

  • Exception occured while connecting to WCF endpoint: System.ServiceModel.Security.SecurityAccessDeniedException: Access is denied.
  • UserProfileApplicationProxy.InitializePropertyCache: Microsoft.Office.Server.UserProfiles.UserProfileException: System.ServiceModel.Security.SecurityAccessDeniedException
  • Failure retrieving application ID for User Profile Application Proxy ‘User Profile Service Application Proxy’: System.NullReferenceException: Object reference not set to an instance of an object.
  • Failure retrieving application ID for User Profile Application Proxy ‘User Profile Service Application Proxy’: System.NullReferenceException: Object reference not set to an instance of an object.
  • MigrateFormsLegacyToFormsClaims.Migrate: User: AspnetSqlMembers:employee1, Failed to migrate to: i:0#.f|aspnetsqlmembers|employee1, Exception: System.ArgumentNullException: Value cannot be null. Parameter name: userProfileApplicationProxy

…you need to assign “Full Control” permissions on the User Profile Service App to the executing user! Otherwise you are not able to convert the users!

After executing the script above the database looks like this:



I’ve assembled all scripts for this article in one PowerShell script. You can download it here:


PowerShell Script to Add Account to “Allow Logon Locally” privilege on Local Security Policy

As you know, the SharePoint farm account must have the privilege to log on locally in order for the “User Profile Service Application” to work.

Today I created a PowerShell script that adds the given account to the “Allow Logon Locally” privilege in the Local Security Policy.

1. My account is “DOMAIN\sp_farm”

2. I start “secpol.msc” (“Local Security Policy”) on the local farm server


3. I’m looking for “Allow Logon Locally”. The account “sp_farm” is not in this setting.


4. I execute the script to add the account.


5. Then I reload the “Local Security Policy” or close and reopen the MMC.


6. Now the account is in the setting:


You can download the script here:


This is the script:
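The full script is available via the download link; a condensed sketch of the approach, round-tripping the policy through secedit (the account name is an example), could look like this:

```powershell
# Sketch: add an account to the "Allow log on locally" user right
# (SeInteractiveLogonRight) by exporting, editing, and re-importing the
# local security policy with secedit. The account name is an example.
$account = "DOMAIN\sp_farm"
$inf = Join-Path $env:TEMP "user_rights.inf"

secedit /export /cfg $inf /areas USER_RIGHTS | Out-Null

$content = Get-Content $inf | ForEach-Object {
    if( $_ -match "^SeInteractiveLogonRight" -and $_ -notmatch [regex]::Escape($account) ) {
        "$_,$account"    # append the account to the existing list
    } else {
        $_
    }
}
Set-Content -Path $inf -Value $content

secedit /configure /db "$env:TEMP\user_rights.sdb" /cfg $inf /areas USER_RIGHTS | Out-Null
Remove-Item $inf
```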


New Demo Project Released: SharePoint Web Change Log – An Alternate Notification Feature

I created an alternate notification feature for SharePoint 2010. It’s a demo project, done for some practice in SharePoint development and just for fun 🙂

It’s intended to replace the default notification feature of SharePoint 2010, where you subscribe to notifications per list. – With my feature, a user can subscribe to all changes of a SharePoint web by using a menu entry in the Personal Actions menu.

The notification mail is sent to each subscribing user once a day. (Please note that at the moment there is no security trimming for the notification mail!)

Project site: http://spwebchangelog.codeplex.com


How it works:

1. There is a web scoped feature and a farm scoped feature.

2. The web scoped feature is responsible for the Personal Actions menu entry and the change log at web scope.


3. The farm scoped feature deploys a timer job that scans each web every day and sends the notification mail if there are any changes in the web.


4. The job can be scheduled as you like.

5. On each web where the web scoped feature is active, there are two hidden lists:


This list contains a list item for each user that has subscribed for notifications. If a user unsubscribes, the list item is removed.


6. If the web feature is active the “Change Log” list will contain a list item for each change in other lists of the web.

A list event receiver recognizes each list-level change (created lists, deleted lists) and adds list item event receivers to each list in the web.

A list item event receiver creates items in the “Change Log” list for each list item action: add, update, delete.

7. If the web scoped feature is deactivated, the list event receiver and all list item event receivers are removed. When the feature is activated, the list event receiver and a list item event receiver for each existing list are registered.
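As an illustration of this registration step (sketched in PowerShell rather than the project’s C#; the URL, assembly, and class names are placeholders):

```powershell
# Illustration only (the project does this in C#): register an ItemAdded
# receiver on every list of a web. Assembly and class names are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$web = Get-SPWeb "http://sharepoint.local"
foreach ($list in $web.Lists) {
    $list.EventReceivers.Add(
        [Microsoft.SharePoint.SPEventReceiverType]::ItemAdded,
        "SPWebChangeLog, Version=1.0.0.0, Culture=neutral, PublicKeyToken=...",
        "SPWebChangeLog.ChangeLogItemReceiver")
}
$web.Dispose()
```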

8. The farm scoped feature deploys a timer job that scans each web of a specific web application. If the web feature is active in a web, the timer job looks for the change log list and for subscribers. If there is at least one subscriber and at least one change since the last job run, the notification mail is sent.


9. It’s localized for German and English. The notification mail text is part of a resource file, but the resource file value for the mail text can be replaced by using a web property.


10. The notification mail is not security trimmed! That’s important to know before use in a production environment!

11. It’s tested in both a German and an English SharePoint system with both language packs, with multiple site collections and multiple webs and sub webs. I’d like to hear about your experiences. Please report any bugs. Feel free to modify it, but please send me your improvements!

New Tool to Manage Users and Roles for ASP.NET Membership Provider Based Form Based Authentication (FBA)

It’s a nightmare to create users for FBA, isn’t it? – There are several tools out there, but some do not work as expected, or you need to install .NET 4 on a server just to run a simple ASP.NET app that does this job.

In the last 32 minutes ( 😉 ) I created a simple .NET 3.5 based command line tool that enables me (and you) to create and “manage” users for Form Based Authentication.

You can use the tool in the classic command shell, in a batch file, or in a PowerShell script. – I’ll translate it to plain PowerShell.


There is no syntax check or special error handling!


After download you need to open the “ikfbatool.exe.config” file and modify this line:

<add name="aspnetdb" connectionString="Data Source=sps2010;Integrated Security=SSPI;Initial Catalog=aspnetdb"/>



Action                  Command   Parameters
Create User             cu        <username> <password> <email> <question> <answer>
Create Role             cr        <rolename>
List Users              lu        (none)
List Roles              lr        (none)
Add User to Role        au        <username> <rolename>
List User Roles         ur        <username>
Remove User from Role   rr        <username> <rolename>
Delete User             du        <username>
Delete Role             dr        <rolename>
Reset Password          rp        <username> [<answer>]
Unlock User             un        <username>


Usage samples:
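For illustration, typical calls could look like this (all names and values are invented):

```
ikfbatool cu employee1 P@ssw0rd1 employee1@example.com "Question?" "Answer!"
ikfbatool cr allfbausers
ikfbatool au employee1 allfbausers
ikfbatool ur employee1
ikfbatool rp employee1 "Answer!"
```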



You can download the VS 2010 project here:



Or you create your own Visual Studio 2010 Console Application project (.NET 3.5) and paste the following code into “program.cs”. You need to add a reference to System.Web.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web.Security;

namespace ik.SharePoint2010.fbatool
{
    class Program
    {
        static void Main( string[] args )
        {
            try
            {
                if( args.Length < 1 )
                {
                    Console.WriteLine(@"
No warranty. Provided ""as is"". Use it at your own risk!

#create user
cu username password email question answer

#create role
cr rolename

#list users
lu

#list roles
lr

#add user to role
au username rolename

#list user roles
ur username

#delete user
du username

#delete role
dr rolename

#delete user from role  (""role remove"")
rr username rolename

#reset password
rp username [answer]

#unlock user (""UNlock user"")
un username
");
                    return;
                }

                if( args[0] == "cu" )
                {
                    MembershipCreateStatus status;
                    Membership.CreateUser(args[1], args[2], args[3], args[4], args[5], true, out status);
                    Console.WriteLine(status);
                }

                if( args[0] == "cr" )
                    Roles.CreateRole(args[1]);

                if( args[0] == "lu" )
                    foreach( MembershipUser u in Membership.GetAllUsers() )
                        Console.WriteLine(u.UserName);

                if( args[0] == "au" )
                    Roles.AddUsersToRole(new string[] { args[1] }, args[2]);

                if( args[0] == "ur" )
                    foreach( var r in Roles.GetRolesForUser(args[1]) )
                        Console.WriteLine(r);

                if( args[0] == "du" )
                    Membership.DeleteUser(args[1]);

                if( args[0] == "dr" )
                    Roles.DeleteRole(args[1]);

                if( args[0] == "rr" )
                    Roles.RemoveUserFromRole(args[1], args[2]);

                if( args[0] == "rp" )
                {
                    MembershipUser u = Membership.GetUser(args[1]);
                    // without a stored answer the parameterless overload is used
                    Console.WriteLine(args.Length > 2 ? u.ResetPassword(args[2]) : u.ResetPassword());
                }

                if( args[0] == "un" )
                    Membership.GetUser(args[1]).UnlockUser();

                if( args[0] == "lr" )
                    foreach( var r in Roles.GetAllRoles() )
                        Console.WriteLine(r);
            }
            catch( Exception ex )
            {
                var c = Console.ForegroundColor;
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(ex.Message);
                Console.ForegroundColor = c;
            }
        }
    }
}
Now you need to add and configure an “Application Config File” (app.config) with the following content:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <connectionStrings>
    <add name="aspnetdb" connectionString="Data Source=sps2010;Integrated Security=SSPI;Initial Catalog=aspnetdb"/>
  </connectionStrings>
  <system.web>
    <membership defaultProvider="MembershipProvider">
      <providers>
        <clear/>
        <add name="MembershipProvider" connectionStringName="aspnetdb" passwordAttemptWindow="10"
             enablePasswordRetrieval="false" enablePasswordReset="true" applicationName="/"
             passwordFormat="Hashed" minRequiredNonalphanumericCharacters="0"
             passwordStrengthRegularExpression="" requiresQuestionAndAnswer="false"
             requiresUniqueEmail="false" minRequiredPasswordLength="3"
             type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
      </providers>
    </membership>
    <roleManager enabled="true" defaultProvider="RoleManager">
      <providers>
        <clear/>
        <add name="RoleManager" connectionStringName="aspnetdb" applicationName="/"
             type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
      </providers>
    </roleManager>
  </system.web>
</configuration>


You need to adapt the connection string line to meet your system configuration.

The “aspnetdb” database is the one you previously created with “aspnet_regsql.exe”. – You should be able to use any other ASP.NET Membership provider.

Issue starting User Code Service on SharePoint 2010

Today I started the “User Code Service” of SharePoint 2010 to enable execution of Sandboxed Solutions.

Before activation I created a new domain account for the service (*), created a managed account for the new domain account, and changed the service credentials in Central Administration

(CA => Security => Configure service accounts)


(*) This is the important step to run into the issue.


After starting the service on a couple of machines (“Services on Server” page in CA => System Settings) I tried to upload and activate my sandboxed solution.

But without success:


(“No available sandboxed code execution server could be found.”)


The problem: the corresponding Windows service “SharePoint 2010 User Code Host”, a.k.a. “SPUserCodeV4”, was not running, although it was configured for the right account:



After starting it manually I got this error(s) in the ULS log:


[Some BINGing around later…]



The *new* user account for the User Code Service does not have the rights to access the performance counters it needs to do its job.
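One way to grant that access, sketched here under the assumption that membership in the local performance counter groups is sufficient (the account name is a placeholder):

```powershell
# Sketch: give the User Code Service account access to performance counters
# by adding it to the local performance groups. Account name is a placeholder.
$account = "DOMAIN\sp_usercode"
net localgroup "Performance Monitor Users" $account /add
net localgroup "Performance Log Users" $account /add

# Restart the user code host so it picks up the new group membership
Restart-Service -Name SPUserCodeV4
```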

PowerShell Script “SPFolderImport” for Importing a Whole File System Folder into a SharePoint Document Library Using the Content Migration API

Over the last days I’ve created a PowerShell script that enables you to import a whole file system folder into a SharePoint document library using the SharePoint Content Migration API.

It does not create the library, create folders, or upload files through the object model directly.

Instead, it creates an import structure in an output directory and imports that structure by using the Content Migration API.

See this MSDN article for further information on that:

Here I’d like to describe in short how it works.

You can download the script and some demo files from Codeplex: http://spfolderimport.codeplex.com

This script may be used for importing folders and files to SharePoint. But it also shows how to use the Content Migration API in a PowerShell script.


1. This is the SPFolderImport package folder:


The “SPFolderImport.ps1” file is the script that can be used to execute import tasks. You can include the script in your own scripts, as shown in “demo.ps1”.

2. “demo.ps1” shows you how to use the “SPFolderImport” script:


Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue; cls

."$(split-path $MyInvocation.MyCommand.Path)\SPFolderImport.ps1"

$outputDirectory = "$($env:Temp)\SPFolderUplodate_tmp"

$result = SPFolderImport -inDir "$(split-path $MyInvocation.MyCommand.Path)\demoData" -outDir $outputDirectory `
            -dstUrl "http://sharepoint.local/shared documents/SPFolderImport_test" `
            -contentTypeMapping @(@{Type="File"; Filter="*"; ContentTypeName="Document 2"}, @{Type="Folder"; Filter="*"; ContentTypeName="Folder 2"}) `
            -executeImport $false -copyfilesToImportPackage $true -removeExistingDestFolder


First you must have the SharePoint PowerShell extension loaded. After that you need to load the SPFolderImport script itself; in my case it’s in the same directory as the demo script. You could rename “SPFolderImport.ps1” to “SPFolderImport.psm1” and copy it to your personal PowerShell module store in your Windows user profile or to the local computer PowerShell module repository.

The next step is to specify an output folder. There the import structure will be created.

The last four lines belong to one function call: the call of the “SPFolderImport” function.

Parameter list for “SPFolderImport” function:

inDir: folder to import into the SharePoint document library

outDir: folder for the import structure created by the script

dstUrl: destination URL for the imported folder. It’s not the root folder of a document library; instead it is the URL of the folder that will be created during import!

contentTypeMapping: here you can specify content type mappings for the import process. For each mapping you need to add a hash table like this:

@{Type=”<type>”; Filter=”<filter>”; ContentTypeName=”<content_type_name>”}

Type = “File” or “Folder”

Filter = a wildcard string like “*” (=> all files), “*.docx” (=> files that end with “.docx”) or “A*” (=> files that start with “A”)

ContentTypeName = the name of a content type that is already assigned to the destination document library

executeImport (switch): if set, the import process is executed immediately; otherwise the script just creates the import structure in the output folder.

default: false = do not execute

copyFilesToImportPackage (switch): if set, the files for import will be copied to the import structure (output folder).

default: true = do copy files

removeExistingDestFolder (switch): if set, the destination folder in the destination document library will be deleted if it exists; otherwise the folder will not be deleted and the script stops if the folder exists.

default: false = do not delete

retainObjectIdentity (switch): if set, the object identity will be retained. For further information please read http://blogs.technet.com/b/stefan_gossner/archive/2009/01/16/content-deployment-best-practices.aspx by Stefan Goßner (“Problem 1”).

default: true = retain object identity

useFSTimestamps (switch): if set, the “Created” and “Modified” properties of the objects (folders and files) will be set to the file system attributes “CreationTime” and “LastWriteTime”.

default: false = do not use file system time stamps

quiet (switch): if set, the script will not write any output using “write-host”; otherwise it reports each step of processing.


3. Let’s have a look at the process.

This is the root folder of my demo data:


I’m working on a team site with the URL “http://sharepoint.local”.

On the site I’ve created two site content types “Document 2” and “Folder 2”:



“Document 2” has an additional site column (“Start Date”). This column and the “Title” column are mandatory:



“Start Date” has a default value of data type “Date”: “Today”. “Title”, in contrast, has no default value. => This becomes interesting later.


Now I create a document library: “My SPFolderImport test library”:


Then I enabled content type management in the library settings (=> Advanced Settings), added my new content types “Document 2” and “Folder 2”, and added the columns “Title” and “Start Date” to the default view.




Then I copied the link of the library’s source folder and inserted it into the demo script, extended by the name of the destination import folder:


BEFORE execution the library is empty:


Then I execute the “demo.ps1” script.

This is the script output:


Let’s have a look into the output folder that contains the import structure:


This contains all necessary files and information to feed the Content Migration API of SharePoint and import the files and folders into the document library.
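The import step itself can be sketched with the Content Migration API like this (the site URL and the folder path are placeholders):

```powershell
# Sketch of the import step: feed an uncompressed import structure to the
# SharePoint Content Migration API. URL and folder path are placeholders.
$settings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
$settings.SiteUrl = "http://sharepoint.local"
$settings.FileLocation = "$env:Temp\SPFolderImport_out"   # the output folder
$settings.FileCompression = $false                        # plain folder, no .cmp file
$settings.RetainObjectIdentity = $false

$import = New-Object Microsoft.SharePoint.Deployment.SPImport($settings)
$import.Run()
```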

This is the document library AFTER import:


Inside the “SPFolderImport” folder of the document library:


Inside “Folder 2 (DOCX)”:


As you see:

  • The “Modified” date is “now”
  • The “Modified By” user is the user that executed the script

  • The required “Start Date” column is filled with its default value (“Today”)
  • The required “Title” column is empty (!!!)

Let’s edit one file’s properties:


As you see:

  • the content type is set to “Document 2”, as specified in the mapping in the demo script
  • the “Title” field is empty => you will not be able to save the document properties now without entering a value for “Title”!

Let’s have a look at a folder’s properties:


The content type is set as specified in the mapping in the demo script


Last words:

  • I’ve tested the script in many ways
    • Folder in default “Shared Documents” library on site collection root web
    • Folder in default “Shared Documents” library on sub web of a site collection
    • Sub Folder in default “Shared Documents” library on sub web of a site collection
all of that with a custom document library created in the browser.
  • Possible improvements
    • Import versions of files
Implement user mapping for file and/or folder import

If you intend to use this in a production environment, please do lots of tests before you go ahead!

Error during farm config: New-SPConfigurationDatabase failed with exception “The process does not possess the ‘SeSecurityPrivilege’ privilege which is required for this operation.”

Today I tried to configure a new SharePoint farm. I got this error:


“The process does not possess the ‘SeSecurityPrivilege’ privilege which is required for this operation.” – Although the account used to configure SharePoint is a local administrator!

After searching the internet I found the issue and a solution for it.

Check the Local Group policy or the Domain Group Policy and validate the Setting

Computer Configuration => Policies => Windows Settings => Security Settings => Local Policies => User Rights Assignments => Manage auditing and security log


If this setting is defined, the account that is used to install SharePoint needs to be a member of it. – Just add the account or the local “Administrators” (“.\Administrators”) group.

How to install SharePoint Server 2010 Language Pack SP1.

When you install SharePoint Server 2010 by using the official slipstreamed version that includes SP1 you may get an error while installing SP1 for several language packs.

“The expected version of the product was not found on the system – Error when trying to Install SharePoint Service pack”

Today I had this problem with these language packs:

  • German
  • French
  • Spanish
  • Russian

Looking around the internet there was exactly the same issue described in a forum post:


(German, sorry. Try this automatic translation: http://www.microsofttranslator.com/bv.aspx?from=de&to=en&a=http://social.technet.microsoft.com/Forums/de-DE/Sharepointde/thread/3aff60b7-bab7-48be-a890-be702e9ad85a)

…but no solution.

I’ve found several hints in the internet that this is caused by using the official “SharePoint Server 2010 with SP1” setup media.


Some time later I found this site:



I tried to use the trick for the language pack SP1 installers:


W:\SP_Setup\240_SP1\SPS\serverlanguagepack2010sp1-kb2460056-x64-fullfile-de-de.exe PACKAGE.BYPASS.DETECTION.CHECK=1


This solved my issue. I was able to install the SP1 for all affected language packs. – I’m not sure this is a valid procedure for production systems.

Redirecting to new URL with IIS

In IIS there is a trick for redirecting requests to a new URL including the original path and query string. This trick worked on IIS 6. Today I tried it with IIS 7.5 on Windows Server 2008 R2. It’s still working!



1. You have a SharePoint web application with host header “sharepoint.local” and a Shared Documents library with a single document. This link will open the document properties view form:


2. Now you create an IIS redirect for “http://sharepoint-redirect.local”:

For that you create a new web application in IIS:


Now you add the “HTTP Redirect” feature to the web application:



Click “Apply” after configuration.

3. If you test it in the browser with URLs like http://sharepoint-redirect.local/Shared%20Documents, it works as expected: you get http://sharepoint.local/Shared%20Documents

4. But if you try the link from above, you get an unexpected result:


redirected to:


5. If you uncheck (and “Apply”) the following checkbox in the HTTP Redirect settings of the web application, it looks a little bit better:




redirected to:


The URL is correct, but the query string is missing. So the form is empty:


6. Now the “old” IIS trick for HTTP Redirect:

Use tokens $S$Q in the redirect URL!

$S = Path

$Q = Query String
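Expressed in the redirect site’s web.config, the setting with the tokens might look like this (the destination host is the example from above; the status code is an assumption about this particular setup):

```xml
<configuration>
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="http://sharepoint.local$S$Q"
                  exactDestination="true"
                  httpResponseStatus="Found" />
  </system.webServer>
</configuration>
```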


Now the HTTP redirect works as expected!


redirected to:




For MCTs: Classroom Setup Tools for MS Learning Hyper-V machines

(Download link below!)

If you set up an MS Learning classroom with Hyper-V and the images provided by MS on the MCT download page, you need to deploy the VHD files (and the other VM files, of course) to “C:\Programme\Microsoft Learning\<…>”

The last days I set up a new Hyper-V server at my office:

  • 128GB SSD => system
  • 2TB HDD => machines
  • 32GB RAM

I want to store the MS Learning machines on “D:” (=> 2TB).

The MS Learning machines have differencing VHDs that are based on a couple of “base VHDs”. All differencing VHD files contain references to the location of their parent file:


Furthermore, there are two files for every machine that contain configuration and export/import information. These files have absolute file location references, too.

Because these MS Learning machines are not productive Hyper-V machines, I decided to manipulate the necessary files to be able to store the machines on another drive. Manipulating these files is not supported!

My package contains 4 PowerShell scripts:

1. “Unpack”


This script allows you to decompress all compressed MS Learning machines in a folder to a destination folder.

To run the script you need to have “unrar.exe” in the same directory as the script. Download and execute ftp://ftp.rarlab.com/rar/unrarw32.exe; this is a self-extracting archive that contains “unrar.exe”.

You only need to adapt the “$list” variable: specify all source directories and destination directories you need to extract.
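The exact shape of “$list” is defined in the script; as an assumption, it could be a set of source/destination pairs that the script loops over, e.g.:

```powershell
# Hypothetical shape of the $list variable: source folders containing the
# compressed images and the destination folders for extraction.
$list = @(
    @{ src = "E:\Downloads\10174A"; dst = "I:\Learning\10174" },
    @{ src = "E:\Downloads\10231B"; dst = "I:\Learning\10231" }
)

foreach ($item in $list) {
    & .\unrar.exe x -y "$($item.src)\*.rar" "$($item.dst)\"
}
```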

In my case this is the result:


2. “Patch”

This script manipulates all necessary files.

You need to have them all in a dedicated directory. In my case everything is located under “I:\Learning”. This directory is the entry point for the other scripts. All base and differencing VHDs must be located in this directory.

“Patch” modifies:

  • config.xml => replace VHD paths
  • <machine-guid>.exp => replace VHD paths
  • <name>.vhd / <name>.avhd => replace VHD paths

First, the “Patch” script scans the root path (e.g. “I:\Learning”) and all sub directories. It saves all VHD file names.

The next steps are to open the files specified above and replace all VHD paths (“c:\program files\…”) with the correct paths (“I:\Learning\…”).

For “config.xml” and “<machine-guid>.exp” the script creates backup files with the same name and an additional “.bak” extension. If you run the script twice, the backup will be restored before processing.

For “<machine-guid>.exp” another file will be created: “<machine-guid>.exp.ik”. This file is used later for patching the virtual machine in Hyper-V.
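A condensed sketch of this fix-up (the paths are taken from the text; the real script’s file handling may differ):

```powershell
# Sketch of the path fix-up for config.xml and *.exp files: back them up,
# then replace the original install path with the new root.
$root    = "I:\Learning"
$oldPath = "C:\Program Files\Microsoft Learning"

Get-ChildItem $root -Recurse -Include "config.xml", "*.exp" | ForEach-Object {
    # restore a previous backup if the script ran before
    if (Test-Path "$($_.FullName).bak") {
        Copy-Item "$($_.FullName).bak" $_.FullName -Force
    }
    Copy-Item $_.FullName "$($_.FullName).bak" -Force

    (Get-Content $_.FullName) -replace [regex]::Escape($oldPath), $root |
        Set-Content $_.FullName
}
```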

3. “ImportVMs”

This script imports the MS Learning Machines to Hyper-V using WMI (see http://social.technet.microsoft.com/Forums/en-US/winserverClustering/thread/a4e0d12d-534c-41f6-8038-4e8a7dbbba15).


It imports every machine found in the sub directories of the given machine root: “I:\Learning”

4. “mountVHDs”

You may need to mount the VHDs after importing.

This is a part of the “Classroom Setup Guide” for one training:


In my case this happens for all machines! In the case of course no. “10174” (SharePoint Config and Admin), you would have to patch 15 machines with about 20 VHD files. You would have to repeat mounting the VHDs on every classroom machine! – NO WAY!


While importing the machines you get this event log entries in the Hyper-V VMMS/Admin event log:

  1. Import task failed to fix up connection information for connection ‘I:\Learning\10174\Drives\10174A-SP2010-WFE1-FINAL\Virtual Hard Disks\10174A-SP2010-WFE1-FINAL.vhd’. (Event Id 18430)
  2. Import task failed to fix up connection information for connection ‘I:\Learning\10174\Drives\10174A-SP2010-WFE1-FINAL\Virtual Hard Disks\10174A-SP2010-WFE1-Allfiles-diff.vhd’. (Event Id 18430)
  3. ‘10174A-SP2010-WFE1-FINAL’: The file name ” is invalid. You cannot use the following names (LPTn, COMn, PRN, AUX, NUL, CON) because they are reserved by Windows. (Virtual machine ID FE26F0D1-B18C-440D-8CB6-792DBB6E1A5B) (Event Id 12634)
  4. ‘10174A-SP2010-WFE1-FINAL’ failed to add device ‘Microsoft Virtual Hard Disk’. (Virtual machine ID FE26F0D1-B18C-440D-8CB6-792DBB6E1A5B) (Event Id 14140)
  5. Failed to import correctly the device ‘{ResourceType=21, OtherResourceType="<null>", ResourceSubType="Microsoft Virtual Hard Disk"}’ for ‘10174A-SP2010-WFE1-FINAL’ (Virtual machine ID FE26F0D1-B18C-440D-8CB6-792DBB6E1A5B). Error: Invalid parameter (0x80041008) . (Event Id 18130)
  6. Import completed with warnings. Please check the Admin events in the Hyper-V-VMMS event log for more information. (Event Id 18250)

(These are examples! The paths and names will vary!)




The script “MountVHDs” will modify all machines in Hyper-V. ALL machines! Be very careful! Again: don’t use the scripts in a production environment!

To execute this script you need to install the PowerShell Hyper-V module: http://pshyperv.codeplex.com/

The script opens every machine. Then it searches the MS Learning root (“I:\Learning”) for the file “<machine-guid>.exp.ik”. If there is such a file, the script modifies the virtual machine and mounts the VHDs as configured in the “EXP.IK” file.

For me it works perfectly! – Now I can set up my classroom server in a few minutes!

Here you can download the scripts: http://pscst.codeplex.com – Please feel free to improve them – and to send me your improvements!!!