Update of PS2EXE: Version 0.4 now supports Single and Multi Thread Apartment modes and a “NoConsole” mode

On CodePlex user redpark asked for a “Single Thread Apartment” mode (http://ps2exe.codeplex.com/discussions/435946)…

Here it is…

Please see v0.4 on CodePlex:

http://ps2exe.codeplex.com.

 

There are three new parameters:

-sta: Single Thread Apartment mode

(see http://msdn2.microsoft.com/en-us/library/system.stathreadattribute(VS.71).aspx)

-mta: Multithread Apartment mode

(see http://msdn2.microsoft.com/en-us/library/system.mtathreadattribute(VS.71).aspx)

-noconsole: the resulting EXE is a Windows application, not a console application.

 

The –noconsole parameter lets you create a Windows application EXE file with no console window.

To fully support this mode I would need to implement several extensions for the PowerShell host included in the resulting EXE, but currently I don’t have enough time to do this. For now I’ve implemented the credential prompt, so the cmdlet Get-Credential will work as expected.
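For illustration, an invocation could look like this (the file names are placeholders; the -inputFile/-outputFile parameter names follow the PS2EXE readme and may differ slightly in your version):

.\ps2exe.ps1 -inputFile .\gui-tool.ps1 -outputFile .\gui-tool.exe -sta -noconsole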

Update of PS2EXE: Version 0.3.0.0 Now Supports PowerShell 3.0 and 2.0!

Some time ago I’ve written a little tool called “PS2EXE” that creates .EXE files from PowerShell script files. As mentioned in earlier posts this is no conversion of PS to EXE! The PS2EXE script creates an EXE by using the C# compiler and stores the script as a Base64-encoded string inside a tiny PowerShell host application.
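The embedding principle can be sketched in a few lines of PowerShell (a simplified illustration of the idea, not the actual PS2EXE code; “.\script.ps1” is a placeholder):

$scriptText = [System.IO.File]::ReadAllText(".\script.ps1")
$encoded = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($scriptText))

#The generated host application reverses this at runtime and runs the decoded script:
$decoded = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encoded))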

Today I’ve updated the PS2EXE script to version 0.3.0.0. Now it supports PowerShell 3.0 and PowerShell 2.0.

https://github.com/ikarstein/ps2exe

(Formerly: http://ps2exe.codeplex.com)


There are two new parameters for PS2EXE:

-runtime30

-lcid <int>

Using -runtime30 or by starting PS2EXE in a PowerShell 3.0 environment, PS2EXE creates an EXE file by using the C# compiler version 4.0.

Using -runtime20 or by starting PS2EXE in a PowerShell 2.0 environment, PS2EXE creates an EXE file by using the C# compiler version 2.0.

-lcid sets the “culture” of the current thread to the specified value. (See http://msdn.microsoft.com/en-us/library/system.threading.thread.currentuiculture.aspx and http://msdn.microsoft.com/en-us/library/system.threading.thread.currentculture.aspx and http://msdn.microsoft.com/en-US/library/w4deeh00(v=vs.80).aspx and http://msdn.microsoft.com/en-us/goglobal/bb964664.aspx.)
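For example (again with placeholder file names), this builds the EXE with the C# 2.0 compiler and sets the culture of the resulting EXE to German (LCID 1031):

.\ps2exe.ps1 -inputFile .\script.ps1 -outputFile .\script.exe -runtime20 -lcid 1031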

The new version should fix the “Assembly not referenced” error discussed here:

http://ps2exe.codeplex.com/discussions/404546

On Windows 8 or Windows Server 2012 there is PowerShell 3.0 installed by default. On Windows 7 or Windows Server 2008 R2 you can install it using Windows Management Framework 3.0.

With PowerShell 3.0 installed you will always start the 3.0 environment by using:

the Start Menu (or Start Screen),

the “Run” dialog, or

the command line.

Executing PS2EXE there reports PowerShell version 3.0.

You can start a PowerShell 2.0 environment by starting POWERSHELL.EXE with the parameter “-version 2.0”. Executing PS2EXE there reports PowerShell version 2.0.

Now let’s create an EXE file using the PS2EXE script…

1. Sample: PowerShell 3.0 without the parameters –runtime20 and –runtime30

2. Sample: PowerShell 3.0 with parameter –runtime20

(Behind the scenes this starts PowerShell.exe using the parameter –version 2.0.)

3. Sample: PowerShell 2.0 without the parameters –runtime20 and –runtime30

4. Sample: PowerShell 2.0 with parameter –runtime30

This is not supported!

PDF UPLOAD METADATA EXTRACTOR (sample SharePoint 2013 & 2010 project) on Codeplex

When you upload MS Office documents to SharePoint document libraries, the document title is used to set the default Title column of the uploaded document’s list item.

This does not work for PDF files, but it’s easy to reproduce the functionality.

I have created a simple VS2012 SharePoint project. It’s based on iTextSharp, the C# port of the community version of iText (http://itextpdf.com), which can be downloaded here: http://sourceforge.net/projects/itextsharp/files/itextsharp/

You can download source code and solution packages (“binaries”) from Codeplex:

http://sppdfmetadataextract.codeplex.com/

The project is published under the LGPL license because iTextSharp v4.1.6 requires that. The latest version of iTextSharp (5.3.4) is published under the AGPL, and Codeplex does not provide AGPL licensing, so I had to use the last version of iTextSharp published under the LGPL.

 

Description:

1. On (web) feature activation a feature event receiver iterates through each document library in the web that is not hidden.

2. For each of them the feature event receiver registers a list item event receiver that fires on “ItemAdded” events.

3. Furthermore, a list event receiver is installed for the web that fires on “ListAdded” events, to register the list item event receiver mentioned before on newly created lists.

4. During upload of files to document libraries the list item event receiver looks for files ending with “.pdf” (case insensitive).

5. If there is such a file, it opens the file using the iTextSharp library and reads its “Title” information.

6. This information is set as the default “Title” column of the SharePoint list item.

7. The change is committed by “SystemUpdate” on the SPListItem object.

8. If an error occurs inside the event handler there is no action: the user will never see an error from the module. If it is not possible to extract the title of the PDF document, the module will not set the title column of the list item.
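The core of steps 4 to 7 can be sketched in PowerShell using the same Object Model calls plus the iTextSharp DLL (a simplified illustration, not the project’s actual C# event receiver; all paths and URLs are placeholders):

Add-Type -Path "C:\libs\itextsharp.dll"
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://sharepoint.local"
$item = $web.GetListItem("http://sharepoint.local/Documents/sample.pdf")

try {
    #Read the "Title" information from the PDF content
    $reader = New-Object iTextSharp.text.pdf.PdfReader -ArgumentList @(, $item.File.OpenBinary())
    $title = $reader.Info["Title"]
    $reader.Close()

    if (-not [string]::IsNullOrEmpty($title)) {
        $item["Title"] = $title
        $item.SystemUpdate($false)   #commit without creating a new version
    }
} catch {
    #Like the event receiver: no action on error, the Title column stays untouched
}

$web.Dispose()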

 

Usage:

To use the feature just deploy the SharePoint Solution Package (WSP file) to your SharePoint farm. It’s not a “sandboxed solution”! After that you need to activate the feature in each web where you need it. If you need it activated on each new web you could use “feature stapling” to activate it by default. If you need this, please write me a comment.

Demo in SharePoint 2010:

1. Create a Word document with a title and save it as PDF.

2. Check the document title by using Adobe Reader, Adobe Acrobat, or any other PDF reader.

3. First, upload the DOCX and its PDF into a document library without the new feature activated on the web. The result: the “Title” of the DOCX is used for the Title column of the SharePoint list item; for the PDF file the Title column is empty.

4. Now activate the feature.

5. After that, delete the previously uploaded files from the document library, then upload both files again. Now both “Title” columns are set!

6. My last test is to create a new Asset library in the web, then upload both files and check the PDF’s properties. The Title column is set as expected!!

Demo in SharePoint 2013:

I’ve added a second project just for SP2013.

SharePoint 2013 Design Packages: Import with PowerShell (Part 2 of 2)

Last Thursday I wrote about “Exporting SharePoint 2013 Design Packages with PowerShell”. Today I’d like to show you the import function. These functions can be used to handle SharePoint 2013 Design Packages with PowerShell, e.g. in deployment scenarios, so they should be very useful. (I hope so 😉 ) Feedback welcome!!!

You can download the scripts here:

http://gallery.technet.microsoft.com/Export-and-Import-0f41b376

Here you can find the blog article about “Export-SPDesignPackage”: https://blog.kenaro.com/2013/02/14/sharepoint-2013-design-packages-export-with-powershell-part-1-of-2

 

The import function is called “Import-SPDesignPackage” and here are the details:

Import-SPDesignPackage

Here are some samples:

#First sample

Import-SPDesignPackage -SiteUrl "http://sharepoint.local/publishing" -ImportFileName "C:\temp\publishing2.wsp" -PackageName "P2" -Apply $true

#Second sample


(
    @{ SiteUrl ="http://sharepoint.local/sites/publishing1";
       ImportFileName ="C:\temp\publishing1.wsp";
       PackageName ="P1";
       Apply=$true
    },
    @{ SiteUrl ="http://sharepoint.local/sites/publishing2";
       ImportFileName ="C:\temp\publishing2.wsp";
       PackageName ="P2";
       Apply=$true
    }
) | New-ObjectFromHashtable | Import-SPDesignPackage

The first sample shows you how to import one design package to a dedicated site. By using the “Apply” parameter the design package will be applied to the site immediately.

The second sample shows you how to import two different packages to two different site collections. In the sample I use hashtables for the input parameters. They are assigned to the function parameters by “property name binding”. See “http://technet.microsoft.com/en-us/library/hh847743.aspx”, section “ValueFromPipelineByPropertyName”:

[…]
For example, if the function has a ComputerName parameter, and the 
piped object has a ComputerName property, the value of the ComputerName
property is assigned to the ComputerName parameter of the function.

The following example declares a ComputerName parameter that is 
mandatory and accepts input from the ComputerName property of the 
object that is passed to the function through the pipeline.
[…]

Some more details about that. Skip it if you are not interested…

 

<begin/>

 

This “property name binding” does not work with hashtables. Therefore I created a helper function “New-ObjectFromHashtable” that creates a PowerShell object (“PSObject”). This function is generic. (It’s also included in the script files.)

 

On the one hand, with “new-object System.Management.Automation.PSObject” you can create a new “empty” PowerShell object that can be used in your script like every other object, e.g. a SharePoint object such as an instance of class SPSite, and with the cmdlet “Add-Member” you can add new members to it. On the other hand, you have a hashtable with named values: you can access the collection of names (keys), and with each key you can access its value. Let’s combine the two: you can iterate through the keys collection and create a new member in an empty PSObject instance for each key.

 

function New-ObjectFromHashtable {
    #Written by Ingo Karstein (https://blog.kenaro.com)
    #v1.0
    #Use this function to convert a hashtable to a PowerShell object ("PSObject"),
    #e.g. for using hashtables for property name binding in PowerShell pipelines
    [CmdletBinding()]
    param(
        [parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
        [Hashtable]
        $Hashtable
    )

    begin {
        $results = @()
    }

    process {
        $r = New-Object System.Management.Automation.PSObject

        $Hashtable.Keys | % {
            $key = $_
            $value = $Hashtable[$key]
            $r | Add-Member -MemberType NoteProperty -Name $key -Value $value -Force
        }

        $results += $r
    }

    end {
        $results
    }
}

The resulting object can be passed to every “property name binding” enabled cmdlet. The PowerShell engine tries to match input object property names and cmdlet parameter names. If there is a match, the input object property value gets assigned to the cmdlet’s input parameter.

 

The cmdlet can also convert a list of hashtables to a list of objects. That is used in the “Import-SPDesignPackage” script.
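A quick way to see what the helper produces (the property names here are just placeholders): each hashtable key becomes a NoteProperty of the resulting object.

@{ SiteUrl = "http://sharepoint.local"; PackageName = "P1" } | New-ObjectFromHashtable | Format-List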

 

<end/>

Parameters

Parameter Name (Parameter Set; Mandatory?; Position) – Description

SiteUrl (Default; yes; 0) – Site URL for import

Site (Site; yes; 0) – SPSite object for import

ImportFileName (Default, Site; yes; 1) – File name and path of the design package for import

Apply (Default, Site; yes; 2) – $true = apply the design package after import; $false = only install the design package for later activation

PackageName (Default, Site; no; 3) – Package name. If not specified, the file name without extension is used. The package name will be used for naming the imported file in the solution gallery of the site collection.

MajorVersion (Default, Site; no) – Major version number of the design package. If not specified, “1” is used.

MinorVersion (Default, Site; no) – Minor version number of the design package. If not specified, “0” is used.

 

The function returns an object for each processed (or not processed) site collection:

Object Property – Description

SiteUrl – URL of the processed site

Success – $true = import and “Apply” (if specified) were successful

InputFileFound – $true = file found; $false = file not found

InputFileExtensionValid – $true = the input file has the extension “.wsp”; $false = it has not

SiteFound – $true = the specified site was found

SolutionFileName – The name of the solution, auto-generated from the package name or file name plus the major and minor version numbers. This is the name of the package in the site collection’s solution gallery.

PackageAlreadyExists – $true = the solution already exists in the solution gallery

Some additions

The import process requires the package to be stored inside the site collection before the last import step. Therefore the function creates a folder named “tmp_importspdesignpackage_15494B80-89A0-44FF-BA6C-208CB6A053D0” in the root folder of the site collection’s root web. The package gets uploaded to this folder and is imported from that location. The folder will be deleted after the import, whether it succeeded or failed.

SharePoint 2013 Design Packages: Export with PowerShell (Part 1 of 2)

Last night I searched for “export import sharepoint 2013 design packages powershell”. No luck there. So I created two functions for this purpose. It works nicely 🙂

The functions only use public methods of the SharePoint (Server) Object Model. No reflection, et cetera. The most important class in this context is “Microsoft.SharePoint.Publishing.DesignPackage”. It has two methods, Export and Install, which I used to create two PowerShell functions: Export-SPDesignPackage and Import-SPDesignPackage. Both are able to work in a PowerShell pipeline context. This gives you the possibility to export a bunch of design packages from several sites using PowerShell.

You can download the scripts here:

http://gallery.technet.microsoft.com/Export-and-Import-0f41b376

You can use the scripts as PowerShell modules or by copying the content to your own PowerShell script files. But please be careful: so far they have only been tested in my dev environment!!

The scripts use the SharePoint Server Object Model, so they have to be executed on a SharePoint farm server! You also need a privileged account that has the rights to export (or import) SharePoint Design Packages.

Today I’ll describe the export function. Tomorrow or the day after tomorrow I’ll publish a description of the import function. Here is the import part: https://blog.kenaro.com/2013/02/18/sharepoint-2013-design-packages-import-with-powershell-part-2-of-2/

Export-SPDesignPackage

Here are some samples:

$site1 = Get-SPSite "http://sharepoint.local/publishing"
$site2 = Get-SPSite "http://sharepoint.local/sites/publishing2"

 

#First Sample

 

$cred = New-Object System.Management.Automation.PSCredential("domain\spfarm", (ConvertTo-SecureString -AsPlainText "Passw0rd" -Force))

$site1, $site2 | Export-SPDesignPackage -UseTempFileForExportWithExtension ".wsp" -DownloadCredentials $cred -PackageName "test"

#Second Sample

 

$site1, $site2 | Export-SPDesignPackage -ExportFileName "C:\temp\Package.wsp" -UseExportFileNumbering -IncludeSearchConfig -DisposeSiteObject -OverwriteExistingFiles

#Third Sample

 

(
    @{PackageName="P1"; ExportFileName="C:\temp\p1.wsp"; SiteUrl="http://sharepoint.local/publishing"},
    @{PackageName="P2"; ExportFileName="C:\temp\p2.wsp"; SiteUrl="http://sharepoint.local/sites/publishing2"}
) | New-ObjectFromHashtable | Export-SPDesignPackage

 

#Fourth Sample

 

$site2 | Export-SPDesignPackage -ExportFileName "C:\temp\publishing2.wsp"  -IncludeSearchConfig -DisposeSiteObject -OverwriteExistingFiles

The first sample uses two SPSite objects as (pipeline) input and tells the function to export the design packages to temp files with auto-generated names. For the download of the packages from the sites you can specify “download credentials”.

The second sample exports the same two SPSites. The export file name is specified. By using “UseExportFileNumbering” a number is inserted into the file name like this: “c:\temp\package-1.wsp”. So both export packages have different file names.

The third sample exports two sites with individual export settings for each site. This is possible through “parameter binding by property name”, where PowerShell binds the input object’s properties to the function / cmdlet input parameters by name matching. But this does not work with hashtables. Therefore I created a helper function, “New-ObjectFromHashtable”, that creates a PowerShell object (“PSObject”). This function is generic. (It’s also included in the script files.)

The fourth sample exports just one site.

Parameters

The function has the following parameters. All can be bound by “parameter binding by property name”!

Parameter Name (Parameter Set; Mandatory?; Position) – Description

SiteUrl (Default; yes; 0) – URL of the site collection as System.String

Site (Site; yes; 0) – SPSite object

ExportFileName (Default, Site; no; 1) – Name for the exported file in the file system. The folder must exist! Cannot be used together with “ExportFolder” and “UseTempFileForExportWithExtension”!

ExportFolder (Default, Site; no; 1) – Name of the folder for the exported design packages. The file name will be created through the SharePoint Server Object Model. Cannot be used together with “ExportFileName” and “UseTempFileForExportWithExtension”!

UseTempFileForExportWithExtension (Default, Site; no; 1) – When specified, the function creates a file name automatically by using “System.IO.Path.GetTempFileName()”, but adds the extension you specify here. You should use “.wsp” by default.

PackageName (Default, Site; no; 2) – Name of the export package. This is not a file name but an internal name that is used inside the design package. This is optional; the Object Model can handle it for you.

IncludeSearchConfig (Default, Site; no; 3) – See http://msdn.microsoft.com/en-us/library/jj862342.aspx

DisposeSiteObject (Site; no) – If you use SPSite objects as input you can specify whether the function should dispose them or not. Default: $true = “please dispose it for me”!

DownloadCredentials (Default, Site; no) – You can specify credentials for the download of the generated package. The package is stored in the solution gallery of the site collection. It’s downloaded by the function using System.Net.WebClient with “DefaultNetworkCredentials” if no credentials are specified.

OverwriteExistingFiles (Default, Site; no) – If specified, the export file is overwritten if it exists. If not specified, the export is skipped; in that case the design package is nevertheless created.

UseExportFileNumbering (Default, Site; no) – If you use “ExportFileName” you can specify this parameter to insert numbers into the given file name as described above.

 

The function returns an object for each processed (or not processed) site collection:

These objects can be used in pipelines or… as you like.

Object Property – Description

SiteUrl – URL of the processed site

Success – $true = export successful, export file created

SiteFound – the site collection was found

ExportError – $null = OK; a System.Exception = an error occurred

PackageFileName – name of the package, created by the Object Model

PackageName – name of the package, created by you OR the Object Model

PackageMajorVersion – created by the Object Model during export

PackageMinorVersion – created by the Object Model during export

ExportFileOverridden – if $true, the export file existed before but was overwritten during export. If the “OverwriteExistingFiles” parameter was NOT specified this property will always be $false.

DownloadError – $null = OK; a System.Exception = an error occurred during download

 

Some Additions

The design package will be stored in the solution gallery of the site collection, which can be found here: <site-collection-url>/_catalogs/solutions. It’s the same location as for sandboxed solutions. Interesting: sandboxed solutions are deprecated, but in this case at least the same storage is used for a new functionality.

After the package is created you can download it from the solution gallery. No big deal.

Gimmick: Write To SharePoint Log using PowerShell functions

In preparation for a deployment project I wrote some PowerShell functions to write messages to the SharePoint ULS.

You can download the PowerShell script here: http://gallery.technet.microsoft.com/Write-Messages-to-b59565bf

There are some samples in the package.


In ULSViewer a logged message shows up with the Area (called “Product” in ULSViewer), the Category, the Severity Level, and the Message.

 

You can use the script file “SPLogging.ps1” as a PowerShell module. In the following sample the SPLogging.ps1 file is stored in the same location as “SPLoggingDemo.ps1”. Alternatively, copy the content of “SPLogging.ps1” into your own file.

Import-Module "$(split-path $MyInvocation.MyCommand.Path)\SPLogging.ps1"

Here are some samples of how to create “Areas”:

 

Add-SPDiagnostigLoggingArea -AreaName "TestArea"

"PowerShell", "PS1", "PS2" | Add-SPDiagnostigLoggingArea 

This is how you create categories:

Add-SPDiagnostigLoggingAreaCategory -AreaName "TestArea" -CategoryName "Category1" -TraceSeverityDefault High

Add-SPDiagnostigLoggingAreaCategory "TestArea\Category2" -TraceSeverityDefault High

"Test1", "Test2", "Test3" | Add-SPDiagnostigLoggingAreaCategory -AreaName "PowerShell" 
"Test1", "Test2", "Test3" | Add-SPDiagnostigLoggingAreaCategory -AreaName "PS1" 

You can add new categories by specifying the area and the new category name separately or as a formatted string: <area><backslash><category>

The following snippet shows you how to query the areas and categories you created in your PowerShell session.

Get-SPDiagnosticLoggingCategory -CategoryName "PowerShell\Test1"

Get-SPDiagnosticLoggingCategory -AreaName "PowerShell"

Get-SPDiagnosticLoggingCategory 

You only have access to your own areas and categories!!

Finally here are some examples of how to write messages to the SharePoint ULS. You can use PowerShell pipelining!

Write-SPDiagnosticLogging -CategoryName "PowerShell\Test1" -Message "Hello 1!" 

"Hello 2!" | Write-SPDiagnosticLogging -CategoryName "PowerShell\Test1" 

"Hello 3!", "Current date/time: {0}" | Write-SPDiagnosticLogging -CategoryName "PowerShell\Test2" -MessageArguments @(([DateTime]::Now)) -TraceSeverity "High"

 

Writing to the Windows Event Log is not supported at this moment.

Create Provider Hosted High Trust App for SharePoint 2013 (Short Guide)

About this topic there are several guides. I can’t say that I have anything new to add 😉 But… as always… this blog is a kind of notebook for me. So I post this small guide.

1. You need to have or create a certificate that is used as “security token issuer”. This certificate can be created using IIS Manager or any other tool.

I use “XCA” (http://xca.sourceforge.net/). With that tool you can create your own Certification Authority. (Of course you can use the Windows Server Certification Authority.) I use XCA because it’s easy to manage these kinds of certificates there, and I use the certificates on several dev machines.

If you do so too, you need to create a root certificate for your Certification Authority and install it in the “Trusted Root Certification Authorities” store of your Local Computer (not only your personal cert store).


2. The first step is to register (or create) the certificate within IIS Manager:

Right click on the server node and choose “Server Certificates”.


Use “Import” to apply an existing certificate. Or use “Create Self-Signed Certificate” to create a new certificate.


These are the steps to create a new self-signed certificate. After commit (“OK”) you need to export the certificate twice: once with the private key and a second time without the private key.

3. Open Visual Studio 2012. Create a new project.

For “Issuer ID” you need to create a GUID using Visual Studio or PowerShell. Here is the PowerShell way:

Start PowerShell and enter:

[guid]::newguid().tostring().tolower()

Copy the output into the dialog in Visual Studio 2012.

4. Open a Windows PowerShell ISE, create a new PowerShell script file and copy the following code to it. Most of the code comes from here: http://msdn.microsoft.com/en-us/library/fp179901.aspx. With some additions from Steve Peschka’s Blog articles: http://blogs.technet.com/b/speschka/archive/2012/09/27/another-apps-for-sharepoint-tip-with-the-error-quot-the-issuer-of-the-token-is-not-a-trusted-issuer-quot.aspx and http://blogs.technet.com/b/speschka/archive/2012/11/01/more-troubleshooting-tips-for-high-trust-apps-on-sharepoint-2013.aspx.

###http://msdn.microsoft.com/en-us/library/fp179901.aspx

$publicCertPath = "C:\root\High_Trust_App_1.cer"

#$issuerId = [System.Guid]::NewGuid().ToString()
$issuerId = ([Guid]"4729b8e2-073a-47f0-8538-105ec865f3d2").ToString()

$spurl = "http://sharepoint.local"

$spweb = Get-SPWeb $spurl

$sc = Get-SPServiceContext $spweb.site

$realm = Get-SPAuthenticationRealm -ServiceContext $sc

$certificate = Get-PfxCertificate $publicCertPath

$fullIssuerIdentifier = $issuerId + '@' + $realm

New-SPTrustedSecurityTokenIssuer -Name $issuerId -Certificate $certificate -RegisteredIssuerName $fullIssuerIdentifier -IsTrustBroker

iisreset

write-host "Full Issuer ID: " -nonewline
write-host $fullIssuerIdentifier -ForegroundColor Red
write-host "Issuer ID for web.config: " -nonewline
write-host $issuerId -ForegroundColor Red

#Disable OAuth HTTPS requirement FOR DEV!!

$serviceConfig = Get-SPSecurityTokenServiceConfig
$serviceConfig.AllowOAuthOverHttp = $true
$serviceConfig.Update()


New-SPTrustedRootAuthority -Name "$($certificate.Subject)_$($certificate.Thumbprint)" -Certificate $certificate 

Be sure to change any parameter that does not fit your environment.

 

The following script lines are needed in order to get it working using a SharePoint site without SSL!!

$serviceConfig = Get-SPSecurityTokenServiceConfig

$serviceConfig.AllowOAuthOverHttp = $true

$serviceConfig.Update()

If you use SSL (e.g. https://sharepoint.local) you can skip this.

No other steps are required. I’ve tested this several times, always with fresh SP 2013 environments, because I had some difficulties getting this set up.

5. At this point I have not changed anything in Visual Studio after creating the project(s) (there are two) through the wizard.

Check the “web.config” file in your web project. There you will find the issuer ID again.

6. Now run the project. You need to trust the app.


Short Note About Error While Creating Search Service Application for SharePoint 2013 by PowerShell: “Value cannot be null. Parameter name: indexLocation”

Today I got an error while creating a Search Service Application for SharePoint 2013:

PS C:\> $SearchSA = New-SPEnterpriseSearchServiceApplication -Name "Enterprise Search Service Application" -ApplicationPool "Search App Pool" -DatabaseName "Search"

New-SPEnterpriseSearchServiceApplication : Value cannot be null.
Parameter name: indexLocation

To resolve this I just started the Search Service Instance on each (search) server in the farm and set its “DefaultIndexLocation” property.

After that I could create the Search Service App.

PS snippet:

$defaultIndexLocation = "D:\SearchIndex"   #example path; the folder must exist on each search server

"SearchServer1", "SearchServer2" | % {
    $svcInst = (Get-SPServer -Identity $_).ServiceInstances | ? { $_.GetType().FullName -eq "Microsoft.Office.Server.Search.Administration.SearchServiceInstance" }
    $svcInst.DefaultIndexLocation = $defaultIndexLocation
    $svcInst.Update()
    $svcInst.Provision()
}

This did it.

How To: Use Git for small dev projects with “private” GIT repositories based on cloud storage providers such as SugarSync or DropBox

In some small dev projects in the last months I was looking for a source control system. I’d like to use Microsoft’s Team Foundation Service, but I was and am not able to use it with my current VS 2010. I cannot use the final version (http://tfs.visualstudio.com) because of a bug in (my?) Visual Studio. (Forum thread related to this problem: http://social.msdn.microsoft.com/Forums/en-US/TFService/thread/1b6673ec-8c42-4896-9049-49f17b85bf65)

Anyway…

So I wondered if I could use Git. It’s “[…] a distributed revision control and source code management (SCM) system with an emphasis on speed.” (Source: Wikipedia)

There is GitHub.com where you can host publicly visible projects for free. For “private” projects you have to pay (https://github.com/plans). The pricing is fair! But I was looking for a free alternative.

For cloud storage I’m using several cloud storage providers.

(For data protection on cloud storage I use BoxCryptor.)

For this article I use SugarSync.

In my scenario I have some source code to share between me and other project members. We like to code together. And we need some source control features… With a cloud storage provider I can share local folders with other people… So now:

My aim is to create a cloud storage based source code repository for a small project and share the repository with other developers to work on the same project. My aim *is not* to describe the basics of and the need for source control in software development. (And I do not describe why to do all the steps 😉 )

These are the steps to do so:

  1. Download and install Git for Windows: http://code.google.com/p/msysgit/downloads/list: “Full installer for official Git for Windows 1.8.0 Featured Beta”
  2. Download and install Git Extensions: http://code.google.com/p/gitextensions/downloads/list: “Git Extensions 2.43 Windows installer”. These are some very useful “GUI” extensions for Git.
  3. Download and install Git Source Control Provider for Visual Studio (2012, 2010, 2008): http://visualstudiogallery.msdn.microsoft.com/63a7e40d-4d71-4fbb-a23b-d262124b8f4c
  4. Download and install – if you like – PowerShell extensions for Git: https://github.com/dahlbyk/posh-git
  5. Register for a cloud storage, e.g. SugarSync.
  6. Download and install the cloud storage software on your dev machine and log in.
  7. Open Visual Studio and create your project as normal. Or open an existing project. It’s the same procedure in both situations.


  8. Right click on the project node or solution node in the Solution Explorer and click “Create Git Repository”


  9. Right click on the project and “commit” the changes.


  10. Open Git Shell (PowerShell or Bash), navigate to the project source folder.

    Here you can see that the source folder is recognized as a Git-enabled folder.

  11. The next step is to create the connection to the repository that is shared between project members.
  12. Create a local folder for your Git repository and map the folder to your cloud storage.


  13. Create an empty folder inside the local repository folder. In this folder the project repository will be created.


  14. Open Git Extensions and create a new repository.

  15. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"


  16. The next steps are to create a remote branch and to connect the local branch (“master”) with the remote branch:
    • git push origin master:refs/heads/master


    • git branch -u origin/master master


  17. Now you can use “git push” to upload local changes to the repository and “git pull” to download changes from the repository to the local project folder.
  18. The final step is to share the cloud storage hosted folder with other project members.

That’s it for the publisher.

The next step is that an invited project member needs to sync the repository folder to their local drive and do the following steps:

  1. I simulate this by creating a new local project directory “c:\source\test project 1 other member”.


  2. Then initialize the directory with “git init”.

  3. Now you need to connect the local project folder with the repository.
    • git remote add origin "//localhost/c$/github/test project 1"

    • git pull origin master:refs/heads/master 


  4. The last step is to set the local branch to “track” the remote branch.
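    Based on the publisher-side step above, this is presumably the same command:

    • git branch -u origin/master master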

That’s it, again. These are the basics of how to create a local project and a cloud-hosted repository, how to share the repository, and how to connect to it on a project member’s side.

SQL Server Alias management with PowerShell and WMI – Update for SQL Server 2012

In 02/2011 I created a PowerShell script to set / get / remove SQL Server aliases:

https://blog.kenaro.com/2011/02/09/enumerate-add-update-and-remove-sql-server-aliases-by-using-powershell/

Today I updated it for SQL Server 2012 and tested it on Windows Server 2012 and Windows Server 2008 R2.

You can download it here:

http://gallery.technet.microsoft.com/SQL-Server-2008-2012-Alias-baf05737
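For context, here is a rough sketch of what such an alias boils down to. The script itself manages aliases via WMI; the registry write below is the equivalent low-level operation (alias name and server are placeholders; “DBMSSOCN,<server>,<port>” selects TCP/IP, the “ConnectTo” key must be created first if it does not exist, and 32-bit clients on a 64-bit machine use the Wow6432Node path instead):

New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo" -Name "MyAlias" -Value "DBMSSOCN,sqlserver01.domain.local,1433" -PropertyType String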