Short Note About Error While Creating Search Service Application for SharePoint 2013 by PowerShell: “Value cannot be null. Parameter name: indexLocation”

Today I got an error while creating a Search Service Application for SharePoint 2013:

PS C:\> $SearchSA = New-SPEnterpriseSearchServiceApplication -Name "Enterprise Search Service Application" -ApplicationPool "Search App Pool" -DatabaseName "Search"

New-SPEnterpriseSearchServiceApplication : Value cannot be null.
Parameter name: indexLocation

To resolve this I just started the Search Service Instance on each (search) server in the farm and set its "DefaultIndexLocation" property.

After that I could create the Search Service App.

PS snippet:

"SearchServer1", "SearchServer2" | % {
    $svcInst = (Get-SPServer -Identity $_).serviceinstances | ? { $_.GetType().FullName -eq "Microsoft.Office.Server.Search.Administration.SearchServiceInstance" }
    $svcInst.DefaultIndexLocation = $defaultIndexLocation
    $svcInst.Update()
    $svcInst.Provision()
}

This did it.
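For completeness: after the instances were provisioned, the creation itself went through. A minimal sketch of the remaining steps (the proxy name is my choice, not prescribed anywhere):

$SearchSA = New-SPEnterpriseSearchServiceApplication -Name "Enterprise Search Service Application" -ApplicationPool "Search App Pool" -DatabaseName "Search"
New-SPEnterpriseSearchServiceApplicationProxy -Name "Enterprise Search Service Application Proxy" -SearchApplication $SearchSA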

How To: Use Git for small dev projects with “private” Git repositories based on cloud storage providers such as SugarSync or Dropbox

For some small dev projects in the last months I was looking for a source control system. I would have liked to use Microsoft's Team Foundation Service, but I was and am not able to use it with my current VS 2010: I cannot use the final version (http://tfs.visualstudio.com) because of a bug in (my?) Visual Studio. (Forum thread related to this problem: http://social.msdn.microsoft.com/Forums/en-US/TFService/thread/1b6673ec-8c42-4896-9049-49f17b85bf65)

Anyway…

So I wondered if I could use Git. – It's "a distributed revision control and source code management (SCM) system with an emphasis on speed" (source: Wikipedia).

There is GitHub.com, where you can host publicly visible projects for free. For "private" projects you have to pay (https://github.com/plans). The pricing is fair! But I was looking for a no-cost option.

For cloud storage I'm using these providers:

(For data protection on cloud storage I use BoxCryptor.)

For this article I use SugarSync.

In my scenario I have some source code to share between me and other project members. We want to code together, and we need some source control features… With a cloud storage provider I can share local folders with other people… So:

My aim is to create a cloud storage based source code repository for a small project and share the repository with other developers to work on the same project. My aim *is not* to describe the basics of and the need for source control in software development. (And I do not describe why to do all the steps 😉 )

These are the steps to do so:

  1. Download and install Git for Windows: http://code.google.com/p/msysgit/downloads/list: “Full installer for official Git for Windows 1.8.0 Featured Beta”
  2. Download and install Git Extensions: http://code.google.com/p/gitextensions/downloads/list: "Git Extensions 2.43 Windows installer". These are some very useful GUI extensions for Git.
  3. Download and install Git Source Control Provider for Visual Studio (2012, 2010, 2008): http://visualstudiogallery.msdn.microsoft.com/63a7e40d-4d71-4fbb-a23b-d262124b8f4c
  4. Download and install – if you like – PowerShell extensions for Git: https://github.com/dahlbyk/posh-git
  5. Register for a cloud storage, e.g. SugarSync.
  6. Download and install the cloud storage software on your dev machine and log in.
  7. Open Visual Studio and create your project as normal. Or open an existing project. It’s the same procedure in both situations.

    image

  8. Right click on the project node or solution node in the Solution Explorer and click “Create Git Repository”

    image

  9. Right click on the project and “commit” the changes.

    image

    image

    image

  10. Open Git Shell (PowerShell or Bash), navigate to the project source folder.

    image

    Here you can see that the source folder is recognized as a Git-enabled folder.

  11. The next step is to create the connection to the repository that is shared between project members.
  12. Create a local folder for your Git repository and map the folder to your cloud storage.

    image

    image

  13. Create an empty folder inside the local repository folder. In this folder the project repository will be created.

    image

  14. Open Git Extensions and create a new repository:

    image

  15. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"

      image

  16. The next steps are to create a remote branch and to connect the local branch (“master”) with the remote branch
    • git push origin master:refs/heads/master

      image

    • git branch -u origin/master master

      image

  17. Now you can use "git push" to upload local changes to the repository and "git pull" to download changes from the repository to the local project folder. (See the short day-to-day example after this list.)
  18. The final step is to share the cloud storage hosted folder with other project members:

    image

    image
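A typical day-to-day workflow after the setup looks like this (a minimal example):

git add -A
git commit -m "describe your change"
git pull
git push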

That’s it for the publisher.

The next step is that an invited project member needs to sync the repository folder to his or her local drive and do the following steps:

  1. I simulate this by creating a new local project directory “c:\source\test project 1 other member”.

    image

  2. Then initialize the directory with "git init"

    image

  3. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"

      image
    • git pull origin master:refs/heads/master 

      image

  4. The last step is to set the local branch to “track” the remote branch.

    image
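This is the same command as on the publisher's side:

git branch -u origin/master master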

That's it, again. These are the basics of how to create a local project and a cloud hosted repository, how to share the repository, and how to connect to it on a project member's side.

SQL Server Alias management with PowerShell and WMI – Update for SQL Server 2012

In 02/2011 I created a PowerShell script to set / get / remove SQL Server aliases:

https://blog.kenaro.com/2011/02/09/enumerate-add-update-and-remove-sql-server-aliases-by-using-powershell/

Today I updated it for SQL Server 2012 and tested it on Windows Server 2012 and Windows Server 2008 R2.

You can download it here:

http://gallery.technet.microsoft.com/SQL-Server-2008-2012-Alias-baf05737
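The linked script works through WMI. Just to illustrate what a SQL Server alias is, here is a minimal registry-based sketch (alias name, server and port are made up; this is not the downloadable script):

# Create a TCP/IP alias "MyAlias" that points to server "sqlsrv01", port 1433.
# "DBMSSOCN" selects the TCP/IP protocol.
$path = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"   # 32-bit clients on x64: HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo
if (-not (Test-Path $path)) { New-Item -Path $path -Force | Out-Null }
New-ItemProperty -Path $path -Name "MyAlias" -Value "DBMSSOCN,sqlsrv01,1433" -PropertyType String -Force | Out-Null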

PowerShell Script to Migrate FBA Users from SharePoint 2007 to 2010 *Including FBA Roles*

These days I did some migration work. For experimental purposes I configured my old MOSS 2007 demo machine to use ASP.NET SQL FBA, including MySites and profiles for the FBA users.

First I migrated the old Shared Service Provider config DB as the new User Profile Service App profile DB.

1.

Then I migrated the content databases of a demo web app and the dedicated MySites web app.

After that I configured FBA for both web apps.

The next step was to migrate the “old” user accounts to claims accounts.

Look at the content databases. This is how the "UserInfo" table looks before migration:

image

At this point I need to ensure that the web application is already set up to use FBA. The role provider and membership provider names *must* be the same as in 2007!!!

Therefore I executed

$webApp.MigrateUsers($true)

on both web apps. ($webApp is an object that I retrieved by using the cmdlet Get-SPWebApplication.)
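In full (the web application URL is an assumption):

$webApp = Get-SPWebApplication "http://sharepoint.local"
$webApp.MigrateUsers($true)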

After that the content databases UserInfo table looks like this:

image

THERE IS A PROBLEM!!!! Look at this claim login for example:

image

According to Wictor's description of the claim structure:

The SharePoint 2010 claim encoding format

http://www.wictorwilen.se/Media/Default/Windows-Live-Writer/How-Claims-encoding-works-in-SharePoint-_14813/image_10.png

… this is WRONG!!! "i:0#.f" indicates a user logon name. But "allfbausers" is an FBA role!

The "i:0#.f" must be translated to "c:-.f", which means:

  • "c" => an "other" claim (not an identity claim)
  • "-" => the claim is a role
  • "." => the claim value datatype is string
  • "f" => the claim is issued by forms AuthN

 

It must be migrated manually by using this PowerShell script:

image
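A minimal sketch of what such a script does (assumptions: the role provider is named "aspnetsqlroles", and SPFarm.MigrateGroup is used to map each old role name to its encoded role claim; "c:0-.f" is the full encoded form of the "c:-.f" pattern described above):

$farm = Get-SPFarm
$roles = @("allfbausers")                     # the FBA roles to fix up
foreach ($role in $roles) {
    # map the plain role name to the encoded role claim
    $farm.MigrateGroup($role, "c:0-.f|aspnetsqlroles|" + $role)
}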

If you do not do this step your "old" FBA roles will not work as expected!!! This was my big issue during the last days, until I figured out that the roles had been translated to claims the same way as user identities… which is of course not correct.

After executing the script the content database looks like this:

 

image 

2.

The next step is to migrate the profiles in the User Profile Service App…

Before migration, the UserProfile_Full table of the User Profile Service App's "Profile" database looks like this:

image

Then I executed the “MigrateFormsLegacyUsersToFormsClaims” on the User Profile Service Application using PowerShell.

$upa = Get-SPServiceApplication | where-object {$_.Name -eq $upaName} 
$upa.MigrateFormsLegacyUsersToFormsClaims()
$upa.Upgrade()

$upaName contains the name of the existing User Profile Service App.

If you get an error in the ULS log like this:

image

Error messages:

  • Exception occured while connecting to WCF endpoint: System.ServiceModel.Security.SecurityAccessDeniedException: Access is denied.
  • UserProfileApplicationProxy.InitializePropertyCache: Microsoft.Office.Server.UserProfiles.UserProfileException: System.ServiceModel.Security.SecurityAccessDeniedException
  • Failure retrieving application ID for User Profile Application Proxy ‘User Profile Service Application Proxy’: System.NullReferenceException: Object reference not set to an instance of an object.
  • MigrateFormsLegacyToFormsClaims.Migrate: User: AspnetSqlMembers:employee1, Failed to migrate to: i:0#.f|aspnetsqlmembers|employee1, Exception: System.ArgumentNullException: Value cannot be null. Parameter name: userProfileApplicationProxy

…you need to assign "Full Control" permissions on the User Profile Service App to the executing user! Otherwise you are not able to convert the users!
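A sketch of how to grant that permission by script (assuming "DOMAIN\sp_admin" is the executing account; $upaName as above):

$upa = Get-SPServiceApplication | Where-Object { $_.Name -eq $upaName }
$principal = New-SPClaimsPrincipal -Identity "DOMAIN\sp_admin" -IdentityType WindowsSamAccountName
$security = Get-SPServiceApplicationSecurity $upa -Admin
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"
Set-SPServiceApplicationSecurity $upa -ObjectSecurity $security -Admin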

After executing the script above the database looks like this:

image

3.

I’ve assembled all scripts for this article in one PowerShell script. You can download it here:

http://gallery.technet.microsoft.com/PowerShell-script-to-25b971ba

PowerShell Script to Add Account to “Allow Logon Locally” privilege on Local Security Policy

As you know, the SharePoint Farm Account must have the privilege to log on locally for the "User Profile Service Application" to work.

Today I created a PowerShell script that adds a given account to the "Allow Logon Locally" privilege in the Local Security Policy.

1. My account is "DOMAIN\sp_farm"

2. I start “secpol.msc” (“Local Security Policy”) on the local farm server

image

3. I’m looking for “Allow Logon Locally”. The account “sp_farm” is not in this setting.

image

4. I execute the script to add the account.

image

5. Then I reload the “Local Security Policy” or close and reopen the MMC.

image

6. Now the account is in the setting:

image

You can download the script here:

http://gallery.technet.microsoft.com/PowerShell-script-to-add-b005e0f6

This is the script:

image
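The downloadable script does the real work; as a rough sketch, the same effect can be achieved with "secedit" (an assumption for illustration; it expects that the "SeInteractiveLogonRight" line already exists in the exported INF):

$account = "DOMAIN\sp_farm"
$inf = Join-Path $env:TEMP "secpol.inf"
secedit /export /cfg $inf /areas USER_RIGHTS
# append the account to the "Allow log on locally" privilege line
(Get-Content $inf) -replace '^(SeInteractiveLogonRight = .*)$', "`$1,$account" | Set-Content $inf
secedit /configure /db (Join-Path $env:TEMP "secpol.sdb") /cfg $inf /areas USER_RIGHTS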

New Demo Project Released: SharePoint Web Change Log – An Alternate Notification Feature

I created an alternate notification feature for SharePoint 2010 as a demo project. I've done it for some practice in SharePoint development and just for fun 🙂

It's intended to replace the default notification feature of SharePoint 2010, where you subscribe to notifications per list. – With my feature a user can subscribe to all changes of a SharePoint web by using a menu entry in the Personal Actions menu.

The notification mail is sent to each subscribing user once a day. (Please note that at the moment there is no security trimming for the notification mail!)

Project site: http://spwebchangelog.codeplex.com

 

How it works:

1. There is a web scoped feature and a farm scoped feature.

2. The web scoped feature is responsible for the Personal Actions menu entry and the change log at web scope.

image

3. The farm scoped feature deploys a timer job that scans each web every day and sends the notification mail if there are any changes in the web.

image

4. The job can be scheduled as you like.
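For example, you could reschedule it with PowerShell (the job name here is an assumption):

Get-SPTimerJob | Where-Object { $_.Name -like "*WebChangeLog*" } | Set-SPTimerJob -Schedule "daily between 02:00:00 and 03:00:00"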

5. On each web where the web scoped feature is active, there are two hidden lists:

image

This list contains a list item for each user that has subscribed to notifications. If a user unsubscribes, the list item is removed.

image

6. If the web feature is active, the "Change Log" list will contain a list item for each change in the other lists of the web.

A list event receiver recognizes each list level change (created lists, deleted lists) and adds list item event receivers to each list in the web.

A list item event receiver creates items in the "Change Log" list for each list item action: add, update, delete.

7. If the web scoped feature is deactivated, the list event receiver and all list item event receivers are removed. If the feature gets activated, the list event receiver and a list item event receiver for each existing list are registered.

8. The farm scoped feature deploys a timer job that scans each web of a specific web application. If the web feature is active in a web, the timer job looks for the change log list and for subscribers. If there is at least one subscriber and at least one change since the last job run, the notification mail is sent.

image

9. It's localized for German and English. The notification mail text is part of a resource file, but the resource file value for the mail text can be replaced by using a web property.

image

10. The notification mail is not security trimmed! That's important to keep in mind before using it in a production environment!

11. It's tested in both a German and an English SharePoint system with both language packs, with multiple site collections and multiple webs and sub webs. I'd like to hear about your experiences. Please report any bugs. Feel free to modify it, but please send me your improvements!

New Tool to Manage Users and Roles for ASP.NET Membership Provider Based Form Based Authentication (FBA)

It's a nightmare to create users for FBA, isn't it? – There are several tools out there, but some do not work as expected, or you need to install .NET 4 on a server just to run a simple ASP.NET app that does this job.

In the last 32 minutes ( 😉 ) I created a simple .NET 3.5 based command line tool that enables me (and you) to create and “manage” users for Form Based Authentication.

You can use the tool in the classic command shell, in a batch file or in a PowerShell script. – I'll translate it to plain PowerShell.

 

There is no syntax check or special error handling!

 

After downloading it you need to adjust this line in the "ikfbatool.exe.config" file:

<add name="aspnetdb" connectionString="Data Source=sps2010;Integrated Security=SSPI;Initial Catalog=aspnetdb"/>

 

Commands:

Action                  Command   Parameters
Create User             cu        <username> <password> <email> <question> <answer>
Create Role             cr        <rolename>
List Users              lu        (none)
List Roles              lr        (none)
Add User to Role        au        <username> <rolename>
List User Roles         ur        <username>
Remove User from Role   rr        <username> <rolename>
Delete User             du        <username>
Delete Role             dr        <rolename>
Reset Password          rp        <username> [<answer>]
Unlock User             un        <username>

 

Usage samples:

image
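For example (all names and values are made up):

ikfbatool cu employee1 P@ssw0rd employee1@contoso.local "Question?" "Answer!"
ikfbatool cr allfbausers
ikfbatool au employee1 allfbausers
ikfbatool ur employee1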

 

You can download the VS 2010 project here:

http://gallery.technet.microsoft.com/sharepoint/Tool-to-Manage-Users-and-c75591c4

 

Or you create your own Visual Studio 2010 Console Application project (.NET 3.5) and paste the following code into "program.cs". You need to add a reference to System.Web.

 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web.Security;

namespace ik.SharePoint2010.fbatool
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                if( args.Length < 1 )
                {
                    Console.WriteLine(@"
WRITTEN BY INGO KARSTEIN 
No warranty. Provided as ""as is"". Use it at your own risk!

-------------------------------------------------------------------
#create user
cu username password email question answer 

-------------------------------------------------------------------
#create role
cr rolename

-------------------------------------------------------------------
#list users
lu

-------------------------------------------------------------------
#list roles
lr

-------------------------------------------------------------------
#add user to role
au username rolename

-------------------------------------------------------------------
#list user roles
ur username

-------------------------------------------------------------------
#delete user
du username

-------------------------------------------------------------------
#delete role
dr rolename

-------------------------------------------------------------------
#delete user from role  (""role remove"")
rr username rolename

-------------------------------------------------------------------
#reset password
rp username [answer]

-------------------------------------------------------------------
#unlock user (""UNlock user"")
un username
");

                    return;
                }

                if( args[0] == "cu" )
                {
                    MembershipCreateStatus status;
                    Membership.CreateUser(args[1], args[2], args[3], args[4], args[5], true, out status);
                    Console.WriteLine(status.ToString());
                }

                if( args[0] == "cr" )
                {
                    Roles.CreateRole(args[1]);
                }

                if( args[0] == "lu" )
                {
                    foreach( MembershipUser u in Membership.GetAllUsers() )
                    {
                        Console.WriteLine(u.UserName);
                    }
                }

                if( args[0] == "au" )
                {
                    Roles.AddUsersToRole(new string[] { args[1] }, args[2]);
                }

                if( args[0] == "ur" )
                {
                    foreach( var u in Roles.GetRolesForUser(args[1]) )
                    {
                        Console.WriteLine(u);
                    }
                }

                if( args[0] == "du" )
                {
                    Membership.DeleteUser(args[1]);
                }

                if( args[0] == "dr" )
                {
                    Roles.DeleteRole(args[1]);
                }

                if( args[0] == "rr" )
                {
                    Roles.RemoveUserFromRole(args[1], args[2]);
                }

                if( args[0] == "rp" )
                {
                    if( args.Length < 3 || string.IsNullOrEmpty(args[2]) )
                        Console.WriteLine(Membership.GetUser(args[1]).ResetPassword());
                    else
                        Console.WriteLine(Membership.GetUser(args[1]).ResetPassword(args[2]));
                }

                if( args[0] == "un" )
                {
                    Membership.GetUser(args[1]).UnlockUser();
                }

                if( args[0] == "lr" )
                {
                    foreach( var u in Roles.GetAllRoles() )
                    {
                        Console.WriteLine(u);
                    }
                }

            }
            catch( Exception ex )
            {
                var c = Console.ForegroundColor;
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(ex.Message);
                Console.ForegroundColor = c;
            }
        }
    }
}

Now you need to add and configure an "Application Config File" (app.config) with the following content:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings/>
  <connectionStrings>
    <add name="aspnetdb" connectionString="Data Source=sps2010;Integrated Security=SSPI;Initial Catalog=aspnetdb"/>
  </connectionStrings>
  <system.web>
    <membership defaultProvider="MembershipProvider">
      <providers>
        <clear/>
        <add name="MembershipProvider" connectionStringName="aspnetdb" passwordAttemptWindow="10"
             enablePasswordRetrieval="false" enablePasswordReset="true" applicationName="/"
             passwordFormat="Hashed" minRequiredNonalphanumericCharacters="0"
             passwordStrengthRegularExpression="" requiresQuestionAndAnswer="false"
             requiresUniqueEmail="false" minRequiredPasswordLength="3"
             type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
      </providers>
    </membership>
    <roleManager enabled="true" defaultProvider="RoleManager">
      <providers>
        <clear/>
        <add name="RoleManager" connectionStringName="aspnetdb" applicationName="/"
             type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
      </providers>
    </roleManager>
  </system.web>
</configuration>

 

You need to adapt the "aspnetdb" connection string line to meet your system configuration.

The "aspnetdb" database is the one you have previously created with "aspnet_regsql.exe". – You should be able to use any other ASP.NET Membership provider.
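If you still need to create that database, "aspnet_regsql.exe" can do it; for example (server name as in the config above; "-E" uses Windows authentication, "-A mr" adds the membership and role manager features):

aspnet_regsql.exe -S sps2010 -E -A mr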

Issue starting User Code Service on SharePoint 2010

Today I started the "User Code Service" of SharePoint 2010 to enable the execution of Sandboxed Solutions.

Before activation I’ve created a new domain account for the service (*), created a managed account for the new domain account and changed the service credentials in the Central Administration

(CA => Security => Configure service accounts =>

image)

(*) This is the important step to run into the issue.

 

After starting the Service on a couple of machines (“Services on Server” page on CA => System Settings) I tried to upload and activate my Sandboxed Solution.

But without success:

image

(“No available sandboxed code execution server could be found.”)

 

The problem: the corresponding Windows service "SharePoint 2010 User Code Host", a.k.a. "SPUserCodeV4", was not running, although it was configured for the right account:

image

 

After starting it manually I got these errors in the ULS log:

image

[Some BINGing around later…]

Solution

http://support.microsoft.com/kb/983081

The *new* user account for the User Code Service does not have the rights to access the performance counters it needs to do its job.
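(One common way to grant such rights is to add the service account to the local "Performance Monitor Users" group; this is my assumption as a shortcut, the KB article describes the authoritative fix. The account name below is made up.)

net localgroup "Performance Monitor Users" DOMAIN\sp_usercode /add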

PowerShell Script "SPFolderImport" for importing a whole file system folder into a SharePoint document library using the Content Migration API

Over the last few days I've created a PowerShell script that enables you to import a whole file system folder into a SharePoint document library using the SharePoint Content Migration API.

It does not create the library, the folders and the files one by one through the object model. Instead it creates an import structure in an output directory and imports that structure by using the Content Migration API.

See this MSDN article for further information on that:

Here I’d like to describe in short how it works.

You can download the script and some demo files from Codeplex: http://spfolderimport.codeplex.com

This script may be used for importing folders and files to SharePoint, but it also shows how to use the Content Migration API in a PowerShell script.
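The heart of it is the Content Migration API's import call, roughly like this (a sketch, assuming an uncompressed import package has already been built in $outputDirectory):

$settings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
$settings.SiteUrl = "http://sharepoint.local"    # destination site
$settings.FileLocation = $outputDirectory        # folder containing the import structure
$settings.FileCompression = $false               # the package is a plain folder, not a .cmp file
(New-Object Microsoft.SharePoint.Deployment.SPImport($settings)).Run()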

 

1. This is the SPFolderImport package folder:

image

The "SPFolderImport.ps1" file is the script that can be used to execute import tasks. You can include it in your own scripts, as shown in "demo.ps1".

2. “demo.ps1” shows you how to use the “SPFolderImport” script:

image

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue; cls

."$(Split-Path $MyInvocation.MyCommand.Path)\SPFolderImport.ps1"

$outputDirectory = "$($env:Temp)\SPFolderUplodate_tmp"

$result = SPFolderImport -inDir "$(Split-Path $MyInvocation.MyCommand.Path)\demoData" -outDir $outputDirectory `
            -dstUrl "http://sharepoint.local/shared documents/SPFolderImport_test" `
            -contentTypeMapping @(@{Type="File"; Filter="*"; ContentTypeName="Document 2"}, @{Type="Folder"; Filter="*"; ContentTypeName="Folder 2"}) `
            -executeImport $false -copyfilesToImportPackage $true -removeExistingDestFolder

 

First you must have loaded the SharePoint PowerShell extension. After that you need to load the SPFolderImport script itself. In my case it's in the same directory as the demo script. You could rename "SPFolderImport.ps1" to "SPFolderImport.psm1" and copy it to your personal PowerShell module store in your Windows user profile or to the local computer PowerShell module repository.

The next step is to specify an output folder. There the import structure will be created.

The last four lines belong to a single function call: the call of the "SPFolderImport" function.

Parameter list for the "SPFolderImport" function:

  • inDir: the folder to import into the SharePoint document library
  • outDir: the folder for the import structure created by the script
  • dstUrl: the destination URL for the imported folder. It's not the root folder of a document library; instead it's the URL of the folder that will be created during import!
  • contentTypeMapping: here you can specify Content Type mappings for the import process. For each mapping you need to add a hash table like this:

    @{Type="<type>"; Filter="<filter>"; ContentTypeName="<content_type_name>"}

    <type> = "File" or "Folder"
    <filter> = a string like "*" (=> all files), "*.docx" (=> files that end with ".docx") or "A*" (=> files that start with "A")
    <content_type_name> = the name of a content type that is already assigned to the destination document library

  • executeImport (switch): if set, the import process is executed immediately; otherwise the script just creates the import structure in the output folder. Default: false (do not execute)
  • copyFilesToImportPackage (switch): if set, the files for import will be copied to the import structure (output folder). Default: true (do copy files)
  • removeExistingDestFolder (switch): if set, the destination folder in the destination document library will be deleted if it exists; otherwise the folder will not be deleted and the script stops if the folder exists. Default: false (do not delete)
  • retainObjectIdentity (switch): if set, the object identity will be retained. For further information please read http://blogs.technet.com/b/stefan_gossner/archive/2009/01/16/content-deployment-best-practices.aspx by Stefan Goßner ("Problem 1"). Default: true (retain object identity)
  • useFSTimestamps (switch): if set, the objects' (folders and files) "Created" and "Modified" properties will be set to the file system attributes "CreationTime" and "LastWriteTime". Default: false (do not use file system time stamps)
  • quiet (switch): if set, the script will not write any output using "write-host"; otherwise it reports the steps of processing.

 

3. Let’s have a look at the process.

This is the root folder of my demo data:

image

I’m working on a team site with URL “http://sharepoint.local”

On the site I’ve created two site content types “Document 2” and “Folder 2”:

image

image

"Document 2" has an additional site column ("Start Date"). This column and the "Title" column are mandatory:

image

image

"Start Date" has a default value of data type "Date": "Today". "Title", in contrast, has no default value. => This will be interesting later.

 

Now I create a document library: “My SPFolderImport test library”:

image

Then I enabled "Content Type management" in the library settings (=> Advanced Settings), added my new content types "Document 2" and "Folder 2", and added the columns "Title" and "Start Date" to the default view.

image

image

image

Then I copied the link of the library's source folder and inserted it into the demo script, extended by the name of the destination import folder:

image

BEFORE execution the library is empty:

image

Then I executed the "demo.ps1" script.

This is the script output:

image

Let's have a look into the output folder that contains the import structure:

image

This contains all necessary files and information to feed the Content Migration API of SharePoint and to import the files and folders into the document library.

This is the document library AFTER import:

image

Inside the “SPFolderImport” folder of the document library:

image

Inside “Folder 2 (DOCX)”:

image

As you see:

  • The “modified” date is “now”
  • The "modified by" user is the user that executed the script

Interesting:

  • The required "Start Date" column is filled with its default value ("Today")
  • the required "Title" column is empty (!!!)

Let's edit one file's properties:

image

As you see:

  • the content type is set to "Document 2" as specified in the mapping in the demo script
  • the "Title" field is empty => you will not be able to save the document properties now without entering a value for "Title"!

Let's have a look at a folder's properties:

image

The content type is set as specified in the mapping in the demo script.

 

Last words:

  • I’ve tested the script in many ways
    • Folder in default “Shared Documents” library on site collection root web
    • Folder in default “Shared Documents” library on sub web of a site collection
    • Sub Folder in default “Shared Documents” library on sub web of a site collection
    • all of that with a custom document library created in the browser
  • Possible improvements
    • Import versions of files
    • Implement user mapping for file and/or folder import

If you intend to use this in a production environment, please do lots of tests before you go ahead!

Error during farm config: New-SPConfigurationDatabase failed with exception "The process does not possess the 'SeSecurityPrivilege' privilege which is required for this operation."

Today I tried to configure a new SharePoint farm. I got this error:

image

"The process does not possess the 'SeSecurityPrivilege' privilege which is required for this operation." – Although the account used to configure SharePoint is a local administrator!

After searching the internet I found the cause and a solution for it.

Check the Local Group Policy or the Domain Group Policy and validate this setting:

Computer Configuration => Policies => Windows Settings => Security Settings => Local Policies => User Rights Assignments => Manage auditing and security log

image

If this setting is defined, the account that is used to install SharePoint needs to be a member of it. – Just add the account or the local "Administrators" (".\Administrators") group.