InfoPath error & solution: InfoPath cannot open the following form … The file is not a valid XML document.

Yesterday I had a problem on a new SharePoint 2013 farm that took me some hours to solve.


Once again I was not able to customize list forms using InfoPath, although I had created them successfully before.


Steps to reproduce:

First I created a site collection and a custom list in it.


Then I used the “Customize Form” button in the Ribbon menu. The new form opened in InfoPath Designer.


Then I modified the form and published it.


In the browser I verified it was working.


Then I wanted to change the form and clicked the Ribbon button “Customize Form” again.


Now this happens:


Error Message:

InfoPath cannot open the following form: http://sharepoint.local/sites/test/Lists/My New List/Item/template.xsn
The file is not a valid XML document.
DTD is prohibited.
Line 1, Position 9

When opening the URL of the InfoPath template in the browser I get this error from the Forms Server:


(Normally there would be a download dialog.)



It’s so simple… 😉 – After some network, log, and code analysis I figured out that you need…

… a site collection at the web application root!

Remember the first screenshot above, where I showed you the possible locations for the new site collection. As you can see, the root “/” is available. This means: there is no site collection at the web application root.


After creating a site collection at the web application root I’m able to customize the list form with InfoPath. – After doing so:


Problem solved 🙂


My friend and colleague Guido could reproduce the problem and validate the fix. – @Guido: Thank you (and have a good week while teaching SharePoint 😉 )!!!



As far as I have seen there are other solutions related to the same problem:

  • Check that your user has at least Contribute rights on the site.
  • Check that your site is in the “Local Sites” or “Intranet” zone of Internet Explorer.

Guest Blog Article on Hey, Scripting Guy! Blog – Weekend Scripter: Run C# Code from Within PowerShell

Link to “C#Script”:

UPDATE 2021-06-23: Found in backup. Now on GitHub: 


Update of PS2EXE: Version 0.4 now supports Single and Multi Thread Apartment modes and a “NoConsole” mode

On CodePlex, user redpark asked for a “Single Thread Apartment” mode (…

Here it is…

Please see v0.4 on CodePlex:


There are 3 new parameters:

-sta: Single Thread Apartment mode


-mta: Multi Thread Apartment mode


-noconsole: the resulting EXE is a Windows application, not a console application.


The -noconsole parameter lets you create a Windows application EXE file with no console window.

Without a console, the PowerShell host inside the EXE has nothing to use for interactive cmdlets. To support all of them I would need to implement several extensions for that host, but currently I do not have enough time to do this. For now I have implemented the credential prompt, so the cmdlet Get-Credential will work as expected.
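As a quick usage example (the file names are placeholders, and the small demo script is hypothetical), a script that calls Get-Credential can now be compiled as a windowless STA application:

```powershell
# GetCred.ps1 - a hypothetical demo script that uses the credential prompt:
#   $cred = Get-Credential
#   Write-Output "Hello, $($cred.UserName)!"

# Compile it with PS2EXE v0.4 as a windowless STA application:
.\ps2exe.ps1 -inputFile .\GetCred.ps1 -outputFile .\GetCred.exe -sta -noconsole
```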

PDF UPLOAD METADATA EXTRACTOR (sample SharePoint 2013 & 2010 project) on Codeplex

When you upload MS Office documents to SharePoint document libraries, their document titles are used to set the default Title column of the uploaded document’s list item.

This does not work for PDF files, but it’s easy to reproduce the functionality.

I have created a simple VS2012 SharePoint project. It’s based on “iTextSharp”, the C# version of the community edition of iText, which can be downloaded here:

You can download source code and solution packages (“binaries”) from Codeplex:

The project is published under the LGPL license because iTextSharp v4.1.6 requires that. – The latest version of iTextSharp (5.3.4) is published under the AGPL. CodePlex does not offer AGPL licensing, so I had to use the last version of iTextSharp published under the LGPL.



1. On (web) feature activation, a feature event receiver iterates through each document library in the web that is not hidden.

2. For each of them the feature event receiver registers a list item event receiver that fires on “ItemAdded” events.

3. Furthermore, a list event receiver is installed on the web that fires on “ListAdded” events, in order to register the aforementioned list item event receiver on newly created lists.

4. During upload of files to document libraries, the list item event receiver looks for files ending with “.pdf” (case-insensitive).

5. If there is such a file, it opens the file using the iTextSharp library and reads its “Title” information.

6. This information is set for the default “Title” column of the SharePoint list item.

7. The change is committed by calling “SystemUpdate” on the SPListItem object.

8. If an error occurs inside the event handler, there is no action; the user will never see an error from the module. If it is not possible to extract the title of the PDF document, the module will not set the Title column of the list item.
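The core of steps 5 to 7 is very little code. Here is a minimal PowerShell sketch of the same idea (the actual project does this in C# inside the event receiver; the DLL path, the sample PDF path, and the $listItem variable are assumptions for illustration only):

```powershell
Add-Type -Path ".\itextsharp.dll"    # iTextSharp v4.1.6 assembly (placeholder path)

# Step 5: open the PDF and read the "Title" entry of its information dictionary
$reader = New-Object iTextSharp.text.pdf.PdfReader("C:\temp\sample.pdf")
$title  = $reader.Info["Title"]
$reader.Close()

# Steps 6 and 7: set the Title column and commit without creating a new version
# ($listItem is assumed to hold the SPListItem of the uploaded file)
if ($title) {
    $listItem["Title"] = $title
    $listItem.SystemUpdate($false)
}
```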



To use the feature just deploy the SharePoint solution package (WSP file) to your SharePoint farm. It’s not a “sandboxed solution”! After that you need to activate the feature in each web where you need it. If you want it activated on each new web by default, you could use “feature stapling”. If you need this, please write me a comment.

Demo in SharePoint 2010:

1. Create a Word document with a title and save it as PDF:




2. Check the document title by using Adobe Reader or Adobe Acrobat or any other PDF reader


3. First, try to upload the DOCX and its PDF into a document library without the new feature activated on the web:


As you can see: The “Title” of the DOCX is used for the Title column of the SharePoint list item. For the PDF file the Title column is empty.

4. Now activate the feature:


5. After that, delete the files uploaded before from the document library. Then upload both files again:


Now both “Title” columns are set!

6. My last test is to create a new Asset library in the web. Then I upload both files and check the PDF’s properties:


The Title column is set as expected!!

Demo in SharePoint 2013:

I’ve added a second project just for SP2013. Here is a single screenshot…


Gimmick: Write To SharePoint Log using PowerShell functions

In preparation for a deployment project I wrote some PowerShell functions to write messages to the SharePoint ULS.

You can download the PowerShell script here:

There are some samples in the package.


This is what it looks like in ULSViewer:


Red = Area or “Product” in ULSViewer

Green = Category

Blue = Severity Level

Purple = Message


You can use the script file “SPLogging.ps1” as a PowerShell module. In the following sample, the SPLogging.ps1 file is stored in the same location as “SPLoggingDemo.ps1”. Alternatively, you can copy the content of “SPLogging.ps1” into your own script file.

Import-Module "$(split-path $MyInvocation.MyCommand.Path)\SPLogging.ps1"

Here are some samples of how to create “Areas”:


Add-SPDiagnostigLoggingArea -AreaName "TestArea"

"PowerShell", "PS1", "PS2" | Add-SPDiagnostigLoggingArea 

This is how you create categories:

Add-SPDiagnostigLoggingAreaCategory -AreaName "TestArea" -CategoryName "Category1" -TraceSeverityDefault High

Add-SPDiagnostigLoggingAreaCategory "TestArea\Category2" -TraceSeverityDefault High

"Test1", "Test2", "Test3" | Add-SPDiagnostigLoggingAreaCategory -AreaName "PowerShell" 
"Test1", "Test2", "Test3" | Add-SPDiagnostigLoggingAreaCategory -AreaName "PS1" 

You can add new categories by specifying the area and the new category name separately or as a formatted string: <area><backslash><category>

The following snippet shows you how to query the areas and categories you created in your PowerShell session.

Get-SPDiagnosticLoggingCategory -CategoryName "PowerShell\Test1"

Get-SPDiagnosticLoggingCategory -AreaName "PowerShell"


You only have access to your own areas and categories!

Finally here are some examples of how to write messages to the SharePoint ULS. You can use PowerShell pipelining!

Write-SPDiagnosticLogging -CategoryName "PowerShell\Test1" -Message "Hello 1!" 

"Hello 2!" | Write-SPDiagnosticLogging -CategoryName "PowerShell\Test1" 

"Hello 3!", "Current date/time: {0}" | Write-SPDiagnosticLogging -CategoryName "PowerShell\Test2" -MessageArguments @(([DateTime]::Now)) -TraceSeverity "High"


Writing to the Windows Event Log is not supported at this moment.

How To: Use Git for small dev projects with “private” Git repositories based on cloud storage providers such as SugarSync or Dropbox

In some small dev projects in the last months I was looking for a source control system. I would have liked to use Microsoft’s Team Foundation Service, but I was, and am, not able to use it with my current VS 2010: I cannot use the final version because of a bug in (my?) Visual Studio. (Forum thread related to this problem:


So I wondered if I could use Git. – It’s “[…] a distributed revision control and source code management (SCM) system with an emphasis on speed.” (Source: Wikipedia)

There is GitHub, where you can host publicly visible projects for free. For “private” projects you have to pay. The pricing is fair! But I was looking for a no-cost alternative.

For cloud storage purposes I’m using these cloud storage providers:

(For data protection on cloud storage I use BoxCryptor.)

For this article I use SugarSync.

In my scenario I have some source code to share between me and other project members. We like to code together. And we need some source control features… With a cloud storage provider I can share local folders with other people… So now:

My aim is to create a cloud storage based source code repository for a small project and share the repository with other developers to work on the same project. My aim *is not* to describe the basics of and the need for source control in software development. (And I do not describe why to do all the steps 😉 )

These are the steps to do so:

  1. Download and install Git for Windows: “Full installer for official Git for Windows 1.8.0 Featured Beta”
  2. Download and install Git Extensions: “Git Extensions 2.43 Windows installer”. These are some very useful GUI extensions for Git.
  3. Download and install Git Source Control Provider for Visual Studio (2012, 2010, 2008):
  4. Download and install – if you like – PowerShell extensions for Git:
  5. Register for a cloud storage, e.g. SugarSync.
  6. Download and install the cloud storage software on your dev machine and log in.
  7. Open Visual Studio and create your project as normal. Or open an existing project. It’s the same procedure in both situations.


  8. Right click on the project node or solution node in the Solution Explorer and click “Create Git Repository”


  9. Right click on the project and “commit” the changes.




  10. Open Git Shell (PowerShell or Bash), navigate to the project source folder.


    Here you can see that the source folder is recognized as a Git-enabled folder.

  11. The next step is to create the connection to the repository that is shared between project members.
  12. Create a local folder for your Git repository and map the folder to your cloud storage.



  13. Create an empty folder inside the local repository folder. In this folder the project repository will be created.


  14. Open Git Extensions and create a new repository:


  15. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"


  16. The next steps are to create a remote branch and to connect the local branch (“master”) with the remote branch:
    • git push origin master:refs/heads/master


    • git branch -u origin/master master


  17. Now you can use “git push” to upload local changes to the repository and “git pull” to download changes from the repository to the local project folder.
  18. The final step is to share the cloud storage hosted folder with other project members:



That’s it for the publisher.
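Put together, the publisher-side commands from the steps above look like this. (This is a self-contained sketch: it creates a throwaway project and uses a temporary local folder as the “cloud-synced” repository. In real use, REPO would be your synced folder, e.g. "//localhost/c$/github/test project 1".)

```shell
# Shared repository: in real life this folder lives in your cloud-synced
# storage (steps 12 to 14 above); here it is just a temporary folder.
REPO="$(mktemp -d)/test-project-1.git"
git init --bare "$REPO"

# Stand-in for your local Git-enabled project folder (steps 7 to 10).
PROJECT="$(mktemp -d)/project"
mkdir -p "$PROJECT" && cd "$PROJECT"
git init && git symbolic-ref HEAD refs/heads/master
git config user.email "you@example.com" && git config user.name "You"
echo "hello" > Program.cs && git add . && git commit -m "initial commit"

git remote add origin "$REPO"              # step 15: connect to the repository
git push origin master:refs/heads/master   # step 16: create the remote branch
git branch -u origin/master master         # step 16: let "master" track it
git push                                   # step 17: publish local changes
```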

The next step is that an invited project member needs to sync the repository folder to their local drive and do the following steps:

  1. I simulate this by creating a new local project directory “c:\source\test project 1 other member”.


  2. Then initialize the directory with “git init”.


  3. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"

    • git pull origin master:refs/heads/master 


  4. The last step is to set the local branch to “track” the remote branch.
    • git branch -u origin/master master
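The member-side steps condense to the following sketch. (Again self-contained: the first block only seeds a temporary “shared” repository with one commit so the member-side commands have something to pull; in real use, REPO is the synced repository folder on the member’s machine.)

```shell
# Seed a shared repository with one commit (stand-in for the publisher's work).
REPO="$(mktemp -d)/test-project-1.git"
git init --bare "$REPO"
SEED="$(mktemp -d)/seed" && mkdir -p "$SEED" && cd "$SEED"
git init && git symbolic-ref HEAD refs/heads/master
git config user.email "you@example.com" && git config user.name "You"
echo "hello" > Program.cs && git add . && git commit -m "initial commit"
git push "$REPO" master:refs/heads/master

# The project member's steps 1 to 4 from above:
COPY="$(mktemp -d)/test-project-1-other-member"
mkdir -p "$COPY" && cd "$COPY"                         # step 1: new local directory
git init && git symbolic-ref HEAD refs/heads/master    # step 2: initialize it
git remote add origin "$REPO"                          # step 3: connect to the repository
git pull origin master:refs/heads/master               # step 3: fetch the project files
git branch -u origin/master master                     # step 4: track the remote branch
```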


That’s it, again. These are the basics of how to create a local project and a cloud-hosted repository, how to share the repository, and how to connect to it on a project member’s side.