Friday, February 27, 2015

"Object doesn't support property or method 'addEventListener'"

I've just been advised of an issue by two of our customers who are using a form we developed in InfoPath and deployed as a solution almost a year ago. It had been working smoothly ever since.

What they're both getting today when trying to fill out the form is:



I tested in my dev environment and could not reproduce the issue with Chrome or IE10, although I have the same version of the solution.

It turned out both customers had upgraded to IE11, and the issue did not occur in Chrome for them.

Both customers are still on SharePoint 2010, so I was eager to turn on Compatibility View in IE11 and, boom, it worked.

Some quick research led me to this post, and I was very surprised that it also applies to SharePoint 2013. Adding the site URL to the Compatibility View list is a quick and easy workaround, but that really shouldn't be the default behaviour, Microsoft. It breaks a lot of other things: if you're using HTML5 on that site (which is very common nowadays), that would no longer work, and that's a big trade-off. Hopefully this gets addressed in a hotfix soon.
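For your own custom script on such pages, the underlying problem is that old document modes lack addEventListener while IE11's default mode dropped the legacy attachEvent. A defensive binding helper like the one below (an illustrative sketch, not SharePoint's or InfoPath's actual code) sidesteps the error in either mode:

```javascript
// Illustrative helper: bind an event handler with addEventListener when
// available, falling back to the legacy attachEvent used by IE8-and-earlier
// document modes, and to a DOM0 property as a last resort.
function addHandler(el, type, handler) {
    if (el.addEventListener) {
        el.addEventListener(type, handler, false);
    } else if (el.attachEvent) {
        el.attachEvent("on" + type, handler);
    } else {
        el["on" + type] = handler;
    }
}
```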

Thursday, February 19, 2015

Establishing STS Trust between SharePoint farms

You might run into a scenario where you have multiple farms and want to manage some of the service applications centrally on one farm, publish them, and consume them from one or more other farms.
One of the prerequisites is establishing an STS trust between the farms, which is what this post is all about. In one of my next posts I'll cover the service publishing and consuming itself in more detail.

So... here's how it looks (assume we have only 2 farms; it could be n farms):


The steps needed to implement this topology are:

1.  Export the root certificate on the Services Farm

We will first need to export our Root certificate from the Services Farm. We will use the Get-SPCertificateAuthority cmdlet to export the certificate for our farm.

On the Services Farm, run the following in the SharePoint 2013 Management Shell:

$rootCert = (Get-SPCertificateAuthority).RootCertificate

$rootCert.Export("Cert") | Set-Content "C:\Cert\ServicesFarmRootCert.cer" -Encoding byte

2.  Create a Certificate on the Consuming Farm

On the Consuming Farm, we need to export not only the Root certificate, but also the Secure Token Service (STS) certificate. The latter can be exported using the Get-SPSecurityTokenServiceConfig cmdlet. To ease the process, we will also get the Farm ID of the Consuming Farm and write it to a text file. The Farm ID needs to be added to the publishing permissions on the Services Farm so that we can access our services later on.

Here's the PowerShell script to run to achieve that; in the first two variables, replace the values with your own server hostnames:

$publisher = "ServicesFarmCAServer"
$consumer = "ConsumingFarmCAServer"
$path = "C:\Cert"
if ((Test-Path $path) -eq $false)
{
    [IO.Directory]::CreateDirectory($path)
}
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content "C:\Cert\ConsumingFarmRootCert.cer" -Encoding byte
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content "C:\Cert\ConsumingFarmSTSCert.cer" -Encoding byte
$farmID = (Get-SPFarm).Id
New-Item C:\Cert\ConsumingFarmID.txt -type file -force -value "$farmID"
Copy-Item \\$consumer\c$\Cert\ConsumingFarmID.txt \\$publisher\c$\Cert

3.  Exchange the certificates between the Consuming and Services Farms

Now we have all the certificates we need from the two farms. Remember, if you have more than one consuming farm, you need to repeat Step 2 for each of them. This is an easy copy-paste operation, but if you have more farms, it makes sense to script it:

$publisher = "ServicesFarmCAServer"
$consumer = "ConsumingFarmCAServer"
Copy-Item \\$publisher\c$\Cert\ServicesFarmRootCert.cer \\$consumer\c$\Cert
Copy-Item \\$consumer\c$\Cert\ConsumingFarmRootCert.cer \\$publisher\c$\Cert
Copy-Item \\$consumer\c$\Cert\ConsumingFarmSTSCert.cer \\$publisher\c$\Cert

4.  Certificate Import on the Services Farm

We now want to import all the Consuming Farm certificates on the Services Farm and establish the trust. We need the Farm ID to set up the permissions later on, so we will rely on the text files we created a few steps back.

Replace ConsumingFarmName with the name you want to use for the trusted provider/consumer; that's what will be visible later in the Trust section under Central Administration -> Security.

$trustCert = Get-PfxCertificate "C:\cert\ConsumingFarmRootCert.cer"
New-SPTrustedRootAuthority ConsumingFarmName -Certificate $trustCert
$stsCert = Get-PfxCertificate "c:\cert\ConsumingFarmSTSCert.cer"
New-SPTrustedServiceTokenIssuer ConsumingFarmName -Certificate $stsCert
$farmID = Get-Content C:\Cert\ConsumingFarmID.txt
$security = Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity
$claimProvider = (Get-SPClaimProvider System).ClaimProvider
$principal = New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimProvider -ClaimValue $farmID
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"
Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security

5. Certificate Import on the Consuming Farm

We have one final step to wrap up our certificate work. On the Consuming Farm(s), run the following script to import the Services Farm Root certificate only.

Replace ServicesFarmName with the name you want to use for the trusted provider; that's what will be visible later in the Trust section under Central Administration -> Security.

$trustCert = Get-PfxCertificate "C:\Cert\ServicesFarmRootCert.cer"
New-SPTrustedRootAuthority ServicesFarmName -Certificate $trustCert

That should be it. Provided your user profiles are in sync and you've followed everything in this article properly, you are now ready to publish some of your service applications and consume them remotely. This works over WAN as well. As mentioned earlier, one of my next blog posts will focus on the publishing/consuming setup.
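As a small preview of that future post, the publish/consume flow looks roughly like this, using a Managed Metadata service application as an example (this is a sketch only; the service application name, the topology service URL, and the published URI below are placeholders you'd replace with your own values):

```
# On the Services Farm: publish the service application
$mms = Get-SPServiceApplication -Name "Managed Metadata Service"
Publish-SPServiceApplication -Identity $mms

# Note the farm's topology service URL (needed on the consuming side)
(Get-SPTopologyServiceApplication).LoadBalancerUrl

# On the Consuming Farm: list what the Services Farm publishes...
Receive-SPServiceApplicationConnectionInfo -FarmUrl "https://servicesfarm:32844/Topology/topology.svc"

# ...and create a local proxy pointing at the published URI reported above
New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service Proxy (remote)" -Uri "<published service URI>"
```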

Thursday, February 12, 2015

Free 100 GB OneDrive storage for 2 years, offer expires 28/02

I hope I caught your attention with the title. I just found out Microsoft has created this great offer, described here. I've already activated it on my personal OneDrive account. The only caveat is that you need a U.S.-based IP address to sign up for Bing Rewards, and then you'll get a bit of spam from Bing/OneDrive. You can unsubscribe from the mails with one click, though.

I think that after the 2 years, such offers will be standard, and this one will likely be extended. Either way, chances are you'll love it and keep the service even if they charge something like $7 a month, which currently gives you 1 TB and an Office 365 subscription.

Monday, February 9, 2015

Can't map a user property to a managed metadata term set

This will be a short one - my favorite type of blog posts :)

The scenario is the following:

Two on-prem SharePoint 2013 farms in remote locations
STS Trust established, tested and working
Farm A has Managed Metadata Service published
Farm B is connected to the Managed Metadata Service on Farm A

I am trying to map a custom user property to a managed metadata term set.
The property is shown in the user profile, and users have permission to add values there.
However, when I go to add something in this field, I am able to pick any term from the whole term store.

When I go to the User Properties in the User Profile Service Application, I can't map it to a specific term set, because the drop-down is not visible (see the "Pick a Term Set for this property:" option).



Solution:

In the Service Applications in Farm B, I had to edit the properties of the connection to the published service from Farm A and enable this option:


Now I can choose any Term Set available in that term store and map the property to it.
The issue and solution also apply to a local Managed Metadata Service; just edit the properties of the Managed Metadata Service proxy.


Managed Metadata Service inaccessible

I came in to work this chilly Monday morning to find out that the navigation on our Intranet is gone...


We're using managed navigation across all sites. There's no way I could recreate it as structured navigation.

I went to check the Managed Metadata service immediately, only to confirm it's not accessible:


Many people have posted about this issue, and some of them suggest it occurred after applying Windows or SharePoint updates. We, however, don't apply any updates automatically, and I haven't applied any recently either.

I verified all the permissions on the Managed Metadata Service and they all seemed intact.

An app pool recycle / iisreset did not help at all.

I tried to restore the Metadata database, but that didn't help either: I could access the service application, but it was empty. In SQL, I could see the terms were still there when I ran a SELECT on the table (I know that's not the best idea).

I also tried this solution, but it didn't work for me: switching back to the original database would not succeed, yet it never returned an error either. With this approach I could access the application, but not see any of our terms.

Then I tried this solution from an MCM, but that didn't work for me either...

Finally I got to this, and it worked. All data was there, and our navigation reappeared immediately after an iisreset:

1. Detach the database.
2. Delete the Managed Metadata service application in Central Administration.
3. Create a new Managed Metadata service application (I used the same app pool and the same account as before, contrary to the article).
4. Add the newly created service application to the farm's default list.
5. Point it to a different database name, for example Metadata_Test.
6. Restore the original Metadata database from the last backup taken overnight.
7. Change the properties of the service app and switch to the name of the restored (original) DB.
8. iisreset.
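The recreate-and-restore steps above can be sketched in PowerShell roughly like this (a sketch under my assumptions; the service application, app pool, and database names are placeholders for your own):

```
# Create a new Managed Metadata service application on a throwaway DB
$sa = New-SPMetadataServiceApplication -Name "Managed Metadata Service" -ApplicationPool "MMS App Pool" -DatabaseName "Metadata_Test"
New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service Proxy" -ServiceApplication $sa -DefaultProxyGroup

# After restoring the original Metadata database in SQL,
# point the service application back at it
Set-SPMetadataServiceApplication -Identity $sa -DatabaseName "Metadata_Original"

iisreset
```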

And it's all good now.

Thursday, February 5, 2015

Nintex Workflow calling a web service fails because of SSL Trust

We're using a Nintex workflow to send some email notifications for a SharePoint 2010 customer.
The workflow has a "Call web service" action which calls the top-level Nintex web service at https://rootsitecollection/_vti_bin/NintexWorkflow/Workflow.asmx. Suddenly, this stopped working. The error logged in the workflow history is this one:


The SSL certificate is valid and has not expired; however, we've recently renewed it.
It's a DigiCert SSL certificate which is imported properly and assigned in IIS as it should be.
Now, for some reason it seems it's not trusted by the Nintex workflow, but what's really happening behind the scenes is that the SharePoint farm doesn't trust it, as a farm only trusts its own local Root certificate by default. I saw that the previous Root certificate of the SSL we renewed had been added to the trust manually, but the root CA has since changed (the vendor did that on purpose after one of the recently discovered SSL bugs, I believe), so I decided to add the new Root CA to the farm's trusted root authorities:

# Register every certificate from the machine's Trusted Root store as a
# farm trusted root authority (certificates with a private key are skipped)
foreach ($cert in (Get-ChildItem cert:\LocalMachine\Root))
{
    if (!$cert.HasPrivateKey)
    {
        New-SPTrustedRootAuthority -Name $cert.Thumbprint -Certificate $cert
    }
}

That worked. If you don't want to run the whole workflow, you can just go to the action that calls the web service and run that action alone (the workflow has to be in edit mode); good practice is to export the workflow first.

Tuesday, February 3, 2015

Content Type Publishing missing

I've been creating and publishing content types for a small reorganization project on a SharePoint 2010 intranet. In just one of the site collections, my newly created content type was not available even after publishing it and manually running the relevant timer jobs: Content Type Hub and Content Type Subscriber. I also found that the Content Type Publishing link in the Site Collection Administration section was missing!

This site was created using the Blank Site template (STS#1). All the others were using the Team Site template and were able to benefit from content type publishing without any issues.
The Blank Site template, for some reason, is lacking one feature - TaxonomyFeatureStapler.

Now, could I afford the luxury of recreating the whole site just because of that? Absolutely not... so I looked for ways to work around it:


stsadm -o activatefeature -id 73EF14B1-13A9-416b-A9B5-ECECA2B0604C -url http://toplevelsiteurl

That worked great.
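If you prefer PowerShell over stsadm, the equivalent command (a sketch I haven't tested myself, as I went the stsadm route) should be:

```
Enable-SPFeature -Identity 73EF14B1-13A9-416b-A9B5-ECECA2B0604C -Url http://toplevelsiteurl
```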

There's also another solution, described in Bill Crider's blog here, but I haven't tested it.