A Couple of Be Prepared Actions When Changing the SharePoint STS Token Signing Certificate for Hybrid Features

I’ve been working recently with some Microsoft Services folks and their customers who are changing the SharePoint STS token signing certificate.  We do this as part of the set of steps required to set up the SharePoint hybrid features for integrating search and BCS between on premises SharePoint farms and Office 365.  A couple of these folks got a little concerned because, after making this change, a seemingly random scattering of users was unable to authenticate into SharePoint.  The answer, as it turns out, was just that those users still had a valid FedAuth cookie from a previous authentication into SharePoint, and that cookie became invalid when the STS token signing certificate was changed.  So just in case, here are a couple of things to remember before you undertake this operation:

  • You can always back up the SharePoint STS token signing certificate before you change it.  You can use the PowerShell I posted about here to export the certificate:  http://blogs.technet.com/b/speschka/archive/2011/12/21/some-easy-powershell-to-export-the-token-signing-certificate-from-sharepoint-2010.aspx.  It never hurts to have that stored away somewhere for a rainy day until you’ve done a complete check of your services.  NOTE:  the link above does not include exporting the private key, which you will need to do as well if you want to reuse that cert in the future.  You can modify the PowerShell to get the private key, or you can just export it out of the Certificates MMC snap-in.
  • If after you change it you have some users who are unable to log in, you can have them a) delete their cookies (which most users are loath to do, including me), or b) use an InPrivate browsing session to access SharePoint.  Their FedAuth cookie is only good for as long as you configured your identity provider to make it, so in most cases the cookie expires in a few hours.  That makes this workaround only temporary; once the cookie expires you should be good to go again.
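To sketch what the private-key export might look like (building on the linked post; the SigningCertificate property usage, file path, and password below are assumptions, and the export only succeeds if the key was marked exportable, otherwise fall back to the MMC snap-in as noted above):

```powershell
# Sketch only: export the STS token signing certificate WITH its private key.
# Assumes the SigningCertificate property used in the linked post; the path
# and password are placeholders. If the private key is not exportable this
# will throw, and you'll need the Certificates MMC snap-in instead.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$bytes = $stsCert.Export(
    [System.Security.Cryptography.X509Certificates.X509ContentType]::Pfx,
    "ChooseAStrongPassword")
[System.IO.File]::WriteAllBytes("C:\stsCert.pfx", $bytes)
```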

Remote SharePoint Index, Hybrid, and SAML

Today’s post combines a few different topics that, dare I say, started as a dare (I think).  The gauntlet thrown out was whether or not Remote SharePoint Index would work for SAML users as well as Windows users.  For those of you not completely familiar with Remote SharePoint Index, I covered it in some detail in my blog post here:  https://samlman.wordpress.com/2015/03/01/setting-up-an-oauth-trust-between-farms-in-sharepoint-2013/.  The gist of it is that you set up an OAuth trust between two SharePoint farms, and it lets you get search results from both farms into a single search center results page.  All of our examples in this area have been around using Windows users, but when the question was posed my best guess was “yeah, seems like that should work”.  As it turns out, it’s true, it really does.  In order to get this to work it requires the following:

  • Two SharePoint farms
  • Both farms have an SPTrustedIdentityTokenIssuer configured (ideally using the same Name, though in theory that should not be required.  In theory.)
  • Both farms have at least one web app configured to use the SPTrustedIdentityTokenIssuer
  • Both farms have UPA configured and running
  • Both farms are doing a profile import from the same Active Directory forest.
  • Both farms have the profile import Authentication Provider Type configured to be Trusted Claims Provider Authentication and the Authentication Provider Instance configured to be their SPTrustedIdentityTokenIssuer.  When you’re done it should look like this:
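For reference, creating the SPTrustedIdentityTokenIssuer on each farm looks roughly like this (a minimal sketch; the certificate path, realm, sign-in URL, and issuer name are all placeholders for your own environment):

```powershell
# Minimal sketch of creating the trusted identity token issuer on each farm.
# Certificate path, realm, sign-in URL, and issuer name are placeholders.
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\idpSigning.cer")
$emailClaim = New-SPClaimTypeMapping `
    -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" `
    -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming
New-SPTrustedIdentityTokenIssuer -Name "SAML Provider" -Description "SAML users" `
    -Realm "urn:sharepoint:farm1" -ImportTrustCertificate $cert `
    -ClaimsMappings $emailClaim -SignInUrl "https://sts.contoso.com/adfs/ls" `
    -IdentifierClaim $emailClaim.InputClaimType
```

Since this post uses email address as the identity claim, the identifier claim above is the email claim mapping; swap in your own claim type if you use something else.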

After you’ve set all of that up, you also need to go into Manage User Properties in the UPA and map both the Claim User Identifier and (usually) the Work email properties.  In my case I used email address as the identity claim, so I mapped both of those properties to the “mail” attribute in Active Directory.  If you are using a different identity claim then you would map the Claim User Identifier accordingly and would not need to map Work email.  By default, the Claim User Identifier is mapped to samAccountName, so you’ll want to Remove that mapping, then add one for mail.  There’s nothing too helpful about doing this; you literally just type “mail” into the Attribute field and then click the Add button.  When you’re done it will look like this:

You’ll probably want to do a full import after that (unless you have a really huge directory).  The biggest gotcha here is that you can really only set up one profile connection per Active Directory domain (at least when using Active Directory import instead of FIM).  The reason that may be a problem is that most folks do an import to get all their Windows users into the UPA.  However, if you want to use the same domain to import users for both Windows and SAML auth into the UPA, it won’t work.  You’ll get an error when you try to create a second connection to the same domain.  So you basically have to be all in – all SAML or all Windows; otherwise you’ll end up getting all sorts of seemingly random results.

Once you’ve got everything configured and you’ve run a profile import in both farms, you can set up the first part of this scenario, which is doing cross-farm searches by SAML users using Remote SharePoint Index.  I literally used the same exact steps that I described in the blog post I referenced at the start of this posting (setting up an OAuth trust between farms).  Once that trust is set up, it’s just a matter of creating a new result source that is configured as a Remote SharePoint Index, using the Url to the web application in the other farm (the one you used when creating the trust between the farms), and then creating a query rule to execute your query against the remote farm.  When it’s all said and done you end up being able to authenticate into your local farm as a SAML user and get search results from the remote farm.  In the screenshot here I’m logged in as a SAML user; you see his full display name because it was brought in when I did the profile sync:
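For context, the heart of that cross-farm OAuth trust boils down to one cmdlet per farm (a sketch; the trust name and host name are placeholders, and the HTTP-only setting applies only to lab environments without SSL):

```powershell
# Run on farm A pointing at farm B's OAuth metadata endpoint, then the
# mirror image on farm B. Names and URLs are placeholders.
New-SPTrustedSecurityTokenIssuer -Name "FarmB Trust" `
    -MetadataEndpoint "https://farmb.contoso.com/_layouts/15/metadata/json/1"
# In a lab without SSL you may also need to allow OAuth over HTTP:
$config = Get-SPSecurityTokenServiceConfig
$config.AllowOAuthOverHttp = $true
$config.Update()
```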

Now, the real beauty of this solution is that we can actually get this to work when using the SharePoint 2013 hybrid search features with Office 365.  In this case I have a farm where I already had the hybrid features configured and working.  When I created my SPTrustedIdentityTokenIssuer, I made sure to include 3 of the 4 claim types (the only ones I populate) that are used in rehydrating users for OAuth calls – email, Windows account, and UPN.  Since those values are also synchronized up to my Office 365 tenant (and thus put into the UPA for my tenant up there), I’m able to rehydrate the user in Office 365 and get search results from there as well, even though I’m logged in locally as a SAML user.  Very cool!!  Here’s a screen shot of the same SAML user getting both local search results as well as hybrid search results from Office 365:

Hopefully this helps to clear up some of the mystery about SAML users and OAuth, as well as how it interacts with some of the built-in features of SharePoint 2013 like Remote SharePoint Index and Hybrid Search.  Clearly the complicating factor in all of this is the UPA and managing your user profile imports.  That would be the first planning item I would be thinking about if you want to use SAML authentication on site, in addition to SharePoint Hybrid.  It’s also clearly a sticking point if you wanted to use both Windows and SAML auth in the same farm with the same users from the same domain.

Setting Up BCS Hybrid Features End to End In SharePoint 2013

For those of you who will be attending SharePoint Conference 2014 (www.sharepointconference.com), I’m going to be teaching a half-day course after the three main days with Bill Baer and some other folks on SharePoint Hybrid features.  This is firming up to be a pretty good session from a presenter standpoint, because we’ll have folks from SharePoint Marketing (Bill), other members of the SharePoint Product Group (Luca, Steve W. and Neil), the Microsoft Technology Center in Irvine (Ali), and our lead / best escalation engineer for SharePoint hybrid (Manas).  That’s a whole lot of hybrid expertise in one location so if you have a chance to attend check out the “SharePoint Server 2013 and Office 365 Hybrid” session in the listings at http://www.sharepointconference.com/content/pre-post-days.

One of the things that I’ll be covering in that session is getting the BCS hybrid features to work.  At a high level, the goal of the BCS hybrid features is to allow you to access your on-premises line of business data from within your Office 365 tenant.  As I’ve started scraping together some content for this topic, I realized that I did not see a really comprehensive “end-to-end” walk through of how to make this work; it’s really a series of documents and blog posts.  So…while getting a sample up and running in the lab I made a list of all the different resources I pulled info from to get things going.  In this post I’m just going to do an outline of what those steps and resources are; if there is need / demand still, then once we’ve delivered on the SharePoint Conference I may try and flesh this out with pictures, examples and more detailed instructions.  For now I’m hoping you can just follow the steps in order, track down the links I provide to pull together the data you need, and use that to build your BCS hybrid applications.

1.  Find / Create an OData data source:  See 2nd half of http://blogs.technet.com/b/speschka/archive/2012/12/06/using-odata-and-ects-in-sharepoint-2013.aspx
2.  Publish your OData endpoint (https):  you need to have an HTTP/HTTPS endpoint from which your OData content is accessible
3.  Create an External Content Type based on your OData data source:  http://msdn.microsoft.com/library/office/jj163967.aspx
4.  Make your ECT file “tenant ready”:  See 1st half of http://blogs.technet.com/b/speschka/archive/2012/12/06/using-odata-and-ects-in-sharepoint-2013.aspx
5.  Create a connection to your on premises services:  Follow the tips described here:  http://blogs.technet.com/b/speschka/archive/2013/04/01/troubleshooting-tips-for-hybrid-bcs-connections-between-office-365-and-sharepoint-2013-on-premises.aspx
6.  Upload your model (.ECT) file to o365 BCS:  Follow the rest of the tips here:  http://blogs.technet.com/b/speschka/archive/2013/04/01/troubleshooting-tips-for-hybrid-bcs-connections-between-office-365-and-sharepoint-2013-on-premises.aspx
7.  Create a new External List and use the BCS model you uploaded:  once your model is uploaded successfully you can create a new External List in o365 and use that to work with your on-premises LOB data.

Let me know how far this gets you, or the trouble spots you might run up against and we’ll see what we can do to address them as time permits.  If you’re coming to the SharePoint Conference hopefully I’ll get a chance to meet up and talk with you there.

Another BCS Hybrid Tip – Fixing “The Type name for the Secure Store provider is not valid” Error

Here’s another error that you are actually pretty likely to see if you are setting up BCS hybrid.  When you configure your BCS connection in o365 to connect to your on premises data source, one of the things you need to configure is the security that’s going to be used to make the call into your data source.  In many cases you will want to use a Secure Store Service application and keep a set of credentials in there.  You can configure your connection to do that just fine, but when you try and import your data model in o365 you will get an error message that says “The type name for the secure store provider is not valid”.  If you look into this error further in your on prem ULS logs, you will see something to the effect that it’s asking for a version of the secure store assembly that matches what’s running in o365 today, which is newer than the version you have on premises.

After looking at some different options, we ultimately decided that for now the best way to work around this problem is to add an assembly binding redirect to the web application in the on premises farm.  When I say “the web application”, what I mean is the web application your BCS connection in o365 points at in the connection.  In the connection itself, you will need to have an endpoint for the client.svc, and to do that you use a web application that you have published through your reverse proxy.  So if your web application is at https://www.foo.com, then in BCS you would configure the endpoint as https://www.foo.com/_vti_bin/client.svc

So…in the web.config for that web application you would add an assembly redirect that looks like this:

<dependentAssembly xmlns="urn:schemas-microsoft-com:asm.v1">
 <assemblyIdentity name="Microsoft.Office.SecureStoreService" publicKeyToken="71e9bce111e9429c" culture="neutral" />
 <bindingRedirect oldVersion="" newVersion="" />
</dependentAssembly>

I’ll have more details on this and all the rest at the post-SPC training that Bill Baer and I are doing.

Configuring Windows Server 2012 R2 Web Application Proxy for SharePoint 2013 Hybrid Features

This is a topic that’s generated a lot of interest over the last couple of months and I’m happy to report that I was recently able to utilize the new Web Application Proxy (WAP) features of Windows Server 2012 R2 to act as a reverse proxy for the SharePoint 2013 hybrid features.  Specifically I configured it for use with the 2-way hybrid search features between SharePoint on premises and Office 365.  In all likelihood this will be the first level of support Microsoft will offer for using WAP with SharePoint hybrid, and then other hybrid features will likely follow suit afterwards.  For now let’s walk through getting this working in your environment.

To begin with, I’m going to assume that you have all of the other hybrid components in place, and really all we are doing at this point is getting the reverse proxy part of the solution configured.  That means that you have an on premises SharePoint farm, an Office 365 tenant, and dirsync and SSO configured already.  I’m also assuming that you’ve configured a result source with a Remote SharePoint Index and a query rule that uses that result source so you can retrieve search results from either farm.  What we’re going to do here is just add the reverse proxy piece to it so you can do it securely (versus having an endpoint to your on premises farm lying wide open out on the Internet).

Getting WAP is really a multi-part process:

  • Get Server 2012 R2 and ADFS installed and configured
  • Get Server 2012 R2 and WAP installed and configured
  • Create a WAP application

The first thing to note is that you cannot install both ADFS and WAP on the same server, so you will need to add at least two servers to your farm, and of course more if you are adding fault tolerance.  Ironically, I found the most difficult thing in this whole process to be configuring ADFS, which was a surprise given how smoothly it went in previous versions.  But new things bring new opportunities, so it’s probably just a matter of getting used to how these things work.  So let’s start with ADFS, and I’ll try and call out the things that would have helped me move along more smoothly.

To begin with, you will go to the Add Roles and Features option in Server Manager to get started.  Once you select your server, check the box for Active Directory Federation Services.  You can complete the wizard to get the service installed, and that part should be completely painless and worry free.  Once the installation is complete you should notice a yellow warning icon in Server Manager, and that’s your indication that you have something to do; in this case, that “something” is configuring ADFS.   Click on the warning icon to see the status information, and then click on the link to configure the federation service on this server.

Since this is our first server we’ll accept the default option to create the first federation server and click the Next button to continue.  On the next page you select an account with AD domain admin permissions and then click Next.  The next page of the wizard is where I ended up messing myself up the first time, so hopefully a little explanation here will help you. 

The first thing you need to do is select the SSL certificate that you will use for connections to ADFS.  One thing worth pointing out right away is that ADFS does not use IIS in Server 2012 R2, so don’t bother looking around for it in the IIS Manager.  This also leads to a potentially irritating point of confusion that I’ll explain a bit later.  So select a certificate (typically a wildcard or SAN certificate) that you will use for your SSL connection to ADFS.  If you’re like me, you have a wildcard certificate that you use for these purposes.  If you Import it (or choose it from the drop down), it automatically places the CN (i.e. subject name) of the certificate in the Federation Service Name – this is where you need to be careful.

Basically, what you should put in the Federation Service Name is the Url you want to use for ADFS requests.  So in my case it put “*.vbtoys.com” because that was my certificate’s subject name, when instead I had to put something like “adfs.vbtoys.com”.  Don’t forget to add an entry in DNS for the Federation Service Name.  Finally, the Federation Service Display Name can be whatever you want it to be; then click Next to continue the wizard.

On the next page you select the service account you are going to use, and then click Next.  On the next page of the wizard you’ll have the option of either creating a new Windows database for the service or using an existing SQL Server database.  Make your selection and click the Next button.  The next page is where you can review your selections or view the PowerShell script it will execute to create the service.  Once you’ve taken a look go ahead and click Next to validate all of your selections.  Assuming all the pre-requisite checks pass, you can now click the Configure button to begin configuring the service. 

One final point on this – EVERY single time I’ve ever run this wizard, it has ALWAYS given me a message about an error setting the SPN for the specified service account.  It should be setting a “HOST” SPN for the ADFS service account for the endpoint you defined for the ADFS service.  I believe the net of this is that if you are setting up a single server for your ADFS farm, then when you create the ADFS service you make the service name the same as your server name.  For example, if your server name is “adfs.foo.com”, it appears that at some point in the installation of features that Windows creates an SPN for “host/adfs.foo.com” and assigns it to the computer “adfs.foo.com”.  When you configure ADFS to use the service name “adfs.foo.com” it wants to assign that same SPN of “host/adfs.foo.com” to the ADFS service account.  Now, if you are using multiple servers (so you have a load balanced name for your ADFS endpoint), or if you just use some other name besides the name of the computer for the ADFS endpoint, you should not have this problem.  If for some reason you get turned sideways, you can always open adsiedit.msc and use the servicePrincipalName attribute on a user or computer to edit the SPNs directly.  In my experience, if you are just using a single server then there really isn’t anything to worry about.
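If you do need to fix things up by hand, the standard setspn tool is an alternative to adsiedit (a sketch; the domain, server name, and account names are placeholders):

```powershell
# Placeholders throughout: FOO is the domain, adfs.foo.com the service name,
# ADFSSERVER the computer account, svcADFS the ADFS service account.
# Remove the auto-created HOST SPN from the computer account, if present:
setspn -D host/adfs.foo.com FOO\ADFSSERVER
# Register the HOST SPN against the ADFS service account instead:
setspn -S host/adfs.foo.com FOO\svcADFS
# List what ended up registered for the service account:
setspn -L FOO\svcADFS
```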

So with that long-winded bit of Kerberos trivia out of the way, you should have completed the configuration of your ADFS server.  Now the logical thing to do is to test it to make sure it is working.  This is the potentially irritating point of confusion I mentioned earlier.  If you read http://technet.microsoft.com/en-us/library/dn528854.aspx it provides a couple of Urls that you can try to validate your ADFS installation.  What it doesn’t tell you is that if you try and run any of these tests on the ADFS server itself, they will fail.  So your big takeaway here is to find another server to test this on, and then I recommend just trying to download the federation metadata Xml file, for example https://fs.contoso.com/federationmetadata/2007-06/federationmetadata.xml.  The article above explains this in more detail.
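That download test can be scripted too; a quick sketch (the host name is a placeholder for your own Federation Service Name, and again, run it from a machine other than the ADFS server itself):

```powershell
# Fetch the federation metadata from a different machine; a healthy ADFS
# install returns well-formed XML. Host name is a placeholder.
$url = "https://adfs.vbtoys.com/federationmetadata/2007-06/federationmetadata.xml"
$response = Invoke-WebRequest -Uri $url -UseBasicParsing
([xml]$response.Content).DocumentElement.LocalName   # expect "EntityDescriptor"
```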

Okay, well the good news is that we got through the hard part; the rest of this is all downhill from here.  For step 2 you need to install WAP on another Windows Server 2012 R2 machine.  One quick distinction worth making here is that the ADFS server must be joined to your domain; your WAP server, on the other hand, should not be.  The process is much simpler here – before you get started though, copy over the SSL certificate you used on your ADFS servers to the WAP server; you will use that later when running the WAP wizard.  You can now begin by opening up the Server Manager and going into Add Roles and Features again.  Once you’ve selected your server, check the Remote Access role and then continue with the wizard.  A few steps later it will ask you what Remote Access services you want to install; check the Web Application Proxy service.  You’ll see a pop up with a few other features that it requires, and you can just click the Add Features button to continue.  After you finish the wizard, like before, you’ll see a warning icon in the Server Manager application.  Just like before, click on it and then select the option to open the Web Application Proxy wizard.

In the wizard you’ll finally get to enter your Federation Service Name, which is the same as you entered above when you ran the ADFS configuration wizard.  You also provide a set of credentials for a local admin on the ADFS server(s) and then click Next in the wizard.  On the next page you need to select the certificate that you use for SSL to your ADFS server(s).  Unfortunately this step in the wizard does not include a button to import a certificate, so you need to open the Certificates snap-in in MMC and add it to the Local Computer’s Personal certificate store.  If you didn’t do this previously, you’ll need to click the Previous button after you add the certificate, then click the Next button again and your certificate should show up in the drop down list.  Click the Next button in the wizard one last time, then you can review the PowerShell it is going to run.  Click the Configure button to complete the process.
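Since the wizard can’t import the certificate for you, here’s a quick sketch of adding it to the machine store up front (the PFX path is a placeholder; it’s the same SSL certificate you used on the ADFS servers, exported with its private key):

```powershell
# Import the ADFS SSL certificate (with private key) into the WAP server's
# machine Personal store so the wizard can find it. Path is a placeholder.
$pfxPwd = Read-Host -AsSecureString -Prompt "PFX password"
Import-PfxCertificate -FilePath "C:\certs\wildcard.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPwd
```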

There’s one last gotcha that you might see at this point.  You may get an error that says “AD FS proxy could not be configured” and then some further details about failing to establish a trust relationship for the SSL/TLS secure channel.  Remember that if you are using domain issued certificates, and the WAP server is not joined to the domain, then it does not automatically trust the root certificate authority (i.e. the “root CA”) for your SSL certificate.  If that is the case, you need to get a copy of the root CA certificate and add it to the Trusted Root Authority store for your local computer on the WAP server.  You can get the root CA certificate in a variety of ways; I just copied it out of the SSL certificate that I used for the ADFS server.  Once you add that you can click the Previous button in the wizard then click the Configure button again and have it do its thing.  At this point you should be good to go, the wizard completes, and the Remote Access Management Console pops up, with pretty much nothing in it.
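Getting the root CA certificate into the machine store is one line (a sketch; the .cer path is a placeholder):

```powershell
# Trust the issuing CA on the non-domain-joined WAP server so it accepts
# the ADFS SSL certificate. The file path is a placeholder.
Import-Certificate -FilePath "C:\certs\rootCA.cer" `
    -CertStoreLocation Cert:\LocalMachine\Root
```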

As I mentioned before, things get progressively easier as we go.  The final step now is to create a new WAP application.  This just means we’re going to publish an endpoint for our on premises SharePoint farm, and Office 365 is going to send the hybrid search queries to that endpoint, where they will get passed back to our on premises farm.  The genius part of what the WAP folks did is boil all of this down to a single line of PowerShell and then – Kaboom! – you got hybrid. 

I’m going to borrow here from some documentation around the WAP feature that will be published soon to describe the PowerShell that you execute (sorry for stealing here UA team):

Add-WebApplicationProxyApplication -ExternalPreauthentication ClientCertificate -ExternalUrl <external URL> -BackendServerUrl <bridging URL> -name <friendly endpoint name> -ExternalCertificateThumbprint <certificate thumbprint> -ClientCertificatePreauthenticationThumbprint <certificate thumbprint> -DisableTranslateUrlInRequestHeaders:$False -DisableTranslateUrlInResponseHeaders:$False

  • Where <external Url> is the external address for the web application.
  • Where <bridging URL> is the internal or bridging URL you defined for the web application in your on-premises SharePoint Server 2013 farm.
  • Where <friendly endpoint name> is a name you choose to identify the endpoint in Web Application Proxy.
  • Where <certificate thumbprint> is the certificate thumbprint, as a string, of the certificate to use for the address specified by the ExternalUrl parameter. This value should be entered twice, once for the ExternalCertificateThumbprint parameter and again for the ClientCertificatePreauthenticationThumbprint parameter.

I will say that I found a couple of these parameters a little confusing myself when I first looked at them so let me provide an example:

  • I have a web application with a default zone of https://www.sphybridlab.com
  • I use a wildcard certificate for that web application, and that certificate has a thumbprint of 6898d6e24a441e7b73f18ecc9b6a72b742cf4ee0
  • I uploaded that same wildcard certificate to a Secure Store Application in Office 365 to use for client authentication on my hybrid queries

So the PowerShell I use to create my WAP application is this:

Add-WebApplicationProxyApplication -ExternalPreauthentication ClientCertificate -ExternalUrl "https://www.sphybridlab.com" -BackendServerUrl "https://www.sphybridlab.com" -name "SphybridLab.Com Hybrid Search" -ExternalCertificateThumbprint "6898d6e24a441e7b73f18ecc9b6a72b742cf4ee0" -ClientCertificatePreauthenticationThumbprint "6898d6e24a441e7b73f18ecc9b6a72b742cf4ee0" -DisableTranslateUrlInRequestHeaders:$False -DisableTranslateUrlInResponseHeaders:$False
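Afterwards you can sanity-check what was created (a quick sketch using the WAP module’s own cmdlet):

```powershell
# Confirm the published application looks the way you intended.
Get-WebApplicationProxyApplication |
    Format-List Name, ExternalUrl, BackendServerUrl, ExternalPreauthentication
```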


That’s it – we’re golden after this.  Here are a few more details about the implementation to clarify things:

  • I have added that same exact wildcard certificate to a Secure Store Service application in my Office 365 tenant; I call that application “HybridCert”.

  • In Office 365 I have a result source that points to my on premises SharePoint farm at https://www.sphybridlab.com.  It is also configured to use the SSO Id of “HybridCert”.


When Office 365 attempts to issue a hybrid query to https://www.sphybridlab.com, it resolves the external IP address registered with my DNS host (GoDaddy).  It submits the query and includes the certificate from the “HybridCert” SSS application as certificate authentication for the request.  WAP is listening on that IP address and requires certificate authentication.  It gets the request, it finds a certificate presented for authentication, and that certificate has the thumbprint that we configured it to look for in our WAP application.  It’s all goodness at that point, so it passes the request back to https://www.sphybridlab.com, and away we go.  The final results then look like this:

 (NOTE:  This picture keeps disappearing from this post and I have no idea why; you can always try this link directly: https://skydrive.live.com/#cid=96D1F7C6A8655C41&id=96D1F7C6A8655C41%216922&v=3.  It’s just an image of an integrated set of search results from on prem and o365 that you get with hybrid search)

As I mentioned above, look for expanding coverage and support on WAP and SharePoint hybrid features in the coming months. 

Architecture Design Recommendation for SharePoint 2013 Hybrid Search Features

The SharePoint 2013 hybrid capabilities are intended to let users in Office 365 access and search across certain content from an on premises SharePoint farm.  By design, current hybrid features cannot be configured to simultaneously allow users outside a corporate network to access the on premises farm and to also allow on premises content to be used in Office 365.  To support both scenarios, users will need to connect directly to the on premises SharePoint farm through a solution such as DirectAccess or VPN.  That will enable users to access an on premises farm both when they are on a corporate network and when they are outside of it, as well as use the hybrid capabilities to work with data from both Office 365 and SharePoint on premises.

Using the search hybrid feature of SharePoint 2013 can best be accommodated with a single zone in SharePoint and a split DNS.  The reason I suggest a single zone is so that search results in Office 365 will be rendered using the same Url that users use to access content.  The reason I suggest split DNS is so that users can be redirected to an endpoint that uses standard SharePoint security mechanisms for authentication, but queries from Office 365 can be directed through a reverse proxy configured to use certificate authentication.  The hybrid search feature in SharePoint 2013 supports sending a certificate for authentication as part of the query request.  Here’s a diagram to illustrate:


Using the example diagram above, when the user requests www.contoso.com while on corpnet, the internal DNS routes them to the internal address of a load balancing device, which sends their request on to one of the SharePoint web front ends, where they will be authenticated and can access their content.

They are still on corpnet – either physically or via DirectAccess or VPN as described above – and they browse their Office 365 tenant on contoso.sharepoint.com.  Their Office 365 tenant is configured to use the search hybrid features, so when a user executes a query, that query will also be sent to the on premises SharePoint farm.  The request goes from Office 365 to www.contoso.com.  Since Office 365 is using the external DNS though, it resolves that to the external address, where a reverse proxy device is listening and requiring certificate authentication.  The search hybrid features are designed to respond to requests for certificate authentication, so Office 365 sends the certificate to authenticate the request.  Once the reverse proxy device completes the certificate authentication, it forwards the request on to the internal load balancer, where it gets routed to one of the SharePoint web front ends.  When the search results are rendered they use www.contoso.com as the host name.  When a user clicks on that search result, they are still on corpnet, so they again use the internal DNS, which resolves the name to the internal load balancer address.  The request is then routed to the load balancer and on to SharePoint, and the user will be able to retrieve the content.
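The split-DNS half of this can be sketched in a couple of records (the zone name and address are hypothetical; the internal record is created on your internal DNS server, while the public record lives at your external DNS host and points at the reverse proxy):

```powershell
# Internal zone: www.contoso.com resolves to the internal load balancer.
# Zone name and address are placeholders for your environment.
Add-DnsServerResourceRecordA -ZoneName "contoso.com" -Name "www" `
    -IPv4Address ""
# The public www.contoso.com record is managed at your external DNS provider
# and points at the reverse proxy's Internet-facing address instead.
```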

Troubleshooting Tips for Hybrid BCS Connections Between Office 365 and SharePoint 2013 On Premises

Let me preface this posting by saying a couple of things:

  1. This is not going to be a “how do I create a BCS hybrid connection to my on-premises farm”; there is a whitepaper coming in the next month or so that will be lengthy and loaded with details on the step by step instructions for doing that.
  2. This posting is meant to be used either a) by people who are incredibly adventurous and want to try and create a BCS hybrid connection without any documentation, or b) by people who have waited until our documentation describing this process is out, have followed it, but are still having trouble getting it to work.

What I’ve tried to capture in this posting are a number of hurdles and issues I had to overcome in order to get BCS hybrid working between my o365 tenant and my on-premises SharePoint 2013 farm.  As with my other posting on Troubleshooting Tips for High Trust Apps on SharePoint 2013 (http://blogs.technet.com/b/speschka/archive/2012/11/01/more-troubleshooting-tips-for-high-trust-apps-on-sharepoint-2013.aspx), this posting is a snapshot in time, but as we hear of other useful information I will come back and update this post.  So, that being said, here are some issues that we saw when getting things configured and some ideas of how you might be able to work around them:


  1. When you create a Connection Settings object (CSO) in your o365 tenant, you must provide a Url for your on-prem farm (the Internet-facing URL property).  o365 is going to reach out to that endpoint in order to invoke the BCS subsystem and connect to your data source.  Whatever Url you choose to publish and use for this purpose, when you configure it in your CSO you MUST add “/_vti_bin/client.svc” at the end of it for it to work correctly.  If you do not do this, BCS will report an error connecting to the on-premises data source.
  2. If you are using Secure Store Service (SSS) for the credentials that will be used to connect to the OData endpoint, you must follow the steps described in this article in order for it to work:  http://social.technet.microsoft.com/wiki/contents/articles/3979.powerpivot-data-refresh-error-credentials-were-not-found-for-the-current-user-within-the-target-application-powerpivotunattended-please-set-the-credentials-for-the-current-user.aspx.   Yes, I know this article pertains to SharePoint 2010 and PowerPivot, but follow these steps or you will get an underlying access denied error when BCS tries to retrieve your SSS application credentials.
  3. Follow the steps I describe here to create the model you will import into BCS in o365:  http://blogs.technet.com/b/speschka/archive/2012/12/06/using-odata-and-ects-in-sharepoint-2013.aspx.
  4. Since your model will be using the Connection Settings object that you create in o365 to connect to the on-premises data, there are some changes you need to make to it; if you do not do this, your model will not be able to connect to the on-premises data source. 
    1. To begin with, you should make a copy of the ECT file that you’ll be importing so you don’t break the version you have with your OData project. 
    2. Delete the ODataServiceMetadataUrl and ODataServiceMetadataAuthenticationMode properties from the LobSystem property list in the ECT file.
    3. Delete the ODataServiceUrl and ODataServiceAuthenticationMode properties from the LobSystemInstance property list in the ECT file.
    4. Add this property to the list of properties for both the LobSystem and the LobSystemInstance:  <Property Name="ODataConnectionSettingsId" Type="System.String">yourConnectionSettingsObjectName</Property>.  As my sample here implies, the property value must be the name of the Connection Settings object that I described in step 1.
  5. Before you try importing your BCS model into the o365 tenant, you need to grant the current user rights to add models first, or you will get an “access denied at 0,0” error when importing the model.
  6. Make sure you grant Everyone at least Execute and Selectable in Client rights to BCS (or whomever you want to be able to connect to these on-premise data sources).  Use the Set Metadata Store Permissions button in the tenant BCS “Manage BCS Models” page.  If you don’t do this, you will get access denied errors for users that have not been granted these rights.
  7. I mentioned the Url you need to configure in the CSO in step 1 above.  Any user that is going to use a BCS hybrid connection must also be granted at least Read rights to the site collection that you use for the Internet-facing URL you configure in your CSO.  Otherwise they will get an access denied error if they try to use the data model; for example, if they try to view an External List based on the External Content Type created when you import the model.
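The ECT edits in step 4 can be scripted if you have several entity files to retarget.  Here is a hypothetical sketch using Python's ElementTree; the namespace below is the BDC model namespace Visual Studio emits in .ect files, but treat the Connection Settings object name as a placeholder for your own.

```python
import xml.etree.ElementTree as ET

# BDC model namespace used in .ect files generated by Visual Studio.
BDC_NS = "http://schemas.microsoft.com/windows/2007/BusinessDataCatalog"
ET.register_namespace("", BDC_NS)

def q(tag):
    """Qualify a tag name with the BDC model namespace."""
    return "{%s}%s" % (BDC_NS, tag)

def retarget_ect(root, cso_name):
    """Remove the direct OData endpoint properties from the LobSystem and
    LobSystemInstance, then point both at a Connection Settings object."""
    direct = {"ODataServiceMetadataUrl", "ODataServiceMetadataAuthenticationMode",
              "ODataServiceUrl", "ODataServiceAuthenticationMode"}
    for parent_tag in ("LobSystem", "LobSystemInstance"):
        for parent in root.iter(q(parent_tag)):
            props = parent.find(q("Properties"))
            if props is None:
                continue
            for prop in list(props):  # copy: we mutate while iterating
                if prop.get("Name") in direct:
                    props.remove(prop)
            cso = ET.SubElement(props, q("Property"),
                                Name="ODataConnectionSettingsId",
                                Type="System.String")
            cso.text = cso_name
    return root
```

Run this against the copy of the .ect file from step 4.1, then import the result through the tenant BCS “Manage BCS Models” page.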


That’s what I have for today.  As I mentioned above, when/if we find other useful troubleshooting tips I will update this post. 

Using OData and ECTs in SharePoint 2013

One of the nice enhancements in the SharePoint 2013 BCS world is that SharePoint can now consume OData in BDC applications.  There are a couple of gaps I ran across recently when going through this process, though, so I thought I’d cover them here in case anyone else gets similarly stuck.  To start with, I recommend this document to walk you through the basics of creating an application for OData:  http://msdn.microsoft.com/en-us/library/sharepoint/jj163967.aspx.  The main takeaway here is that you can NOT create a BDC application in SharePoint Designer that connects to an OData source – to do that you need to create an External Content Type (ECT) using a tool like Visual Studio.

The document I linked to above walks you through the process of creating the ECT.  It follows that by showing how to use those ECTs in a SharePoint App and deploying them that way, but it does NOT show what to do if you want to add them to the BDC catalog so they can be used across many site collections, and that’s where this post comes in.  The first thing to understand is that when you go through the process described in the article above, it will create one ECT for each entity (like a table).  That’s important to know because they will all share the same Model name in the ECT file, which will prevent you from uploading more than one to the BDC catalog.  In order to use each of these entities in SharePoint, here’s what you need to do:

  1. Right-click on the ECT file in Visual Studio and select Open With… then select XML (Text) Editor.  At the top of the document in the Model element you will see a Name attribute.  This value has to be unique across all the ECTs that you upload to the BDC, so you should change each one to a descriptive value for that entity, like “Customers Table”.
  2. You can, but don’t have to, change the Namespace of the Entity element, which is about 20 lines down in the document.  I changed mine to be consistent with the model name, but that’s just a style choice; it’s not required.
  3. Once you’ve made the changes and saved the file, you can upload the .ect file directly to the BDC.  Just use the default options – it’s a model – then click the OK button and you’re good to go.
  4. Once you’ve imported the models, don’t forget to grant permissions to folks to use them; kind of pointless without that.
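If you have a handful of entity files, step 1 can be scripted instead of hand-edited.  This is a hypothetical Python sketch assuming the standard BDC model namespace; the “Customers Table” value is just an example of a descriptive name.

```python
import xml.etree.ElementTree as ET

# BDC model namespace used in .ect files generated by Visual Studio.
BDC_NS = "http://schemas.microsoft.com/windows/2007/BusinessDataCatalog"
ET.register_namespace("", BDC_NS)

def rename_model(ect_xml, new_name):
    """Return the ECT XML with a new, descriptive top-level Model Name."""
    root = ET.fromstring(ect_xml)  # the document element is <Model>
    root.set("Name", new_name)
    return ET.tostring(root, encoding="unicode")
```

Renaming each file this way before upload avoids the duplicate-name collision in the BDC catalog; a cheap sanity check is to collect the new names in a list and confirm `len(names) == len(set(names))`.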

One final thing worth noting here – out of the box you don’t get OData metadata endpoints over things like SQL databases, Azure Table Storage, etc.  Adding it for SQL is fortunately relatively easy.  In a nutshell you:

  1. Create a new Empty ASP.NET web application
  2. Add an ADO.NET Entity Data Model
  3. Add a WCF Data Service
  4. In your WCF Data Service you need to set the Type in the class declaration; this may be a little confusing at first.  What you want to do is look for a file (it should be in the App_Code folder) named something like myDataConnection.Context.tt.  IMPORTANT:  There are usually TWO “.tt” files; find the “blah.Context.tt” file or you will end up using the wrong thing!  If you expand it, underneath you should see a myDataConnection.Context.cs class.  If you open that up you will see the two pieces of information you need for your WCF Data Service:  1) the class name, which you will use as the Type in the WCF Data Service class declaration, and 2) the names of the entities it supports, implemented as get; set; properties.  You will need the entity names in the WCF Data Service as well, because at a minimum you need to call “SetEntitySetAccessRule” for each entity you want to expose.  This is explained in more detail in the comments generated when you add a WCF Data Service – I’m just trying to tell you where to find the entity name to use when you create one of those rules.
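Once the service is running, browsing to its $metadata document is an easy way to confirm the access rules took effect: it should list one EntitySet per entity you exposed.  Here is a hypothetical Python sketch that pulls the EntitySet names out of a $metadata response; the service and entity names in the sample are illustrative, not output from a real service.

```python
import xml.etree.ElementTree as ET

def entity_sets(metadata_xml):
    """Return the sorted EntitySet names advertised by an OData $metadata
    document, ignoring which EDM schema namespace version is in use."""
    root = ET.fromstring(metadata_xml)
    return sorted(el.get("Name")
                  for el in root.iter()
                  if el.tag.endswith("}EntitySet") or el.tag == "EntitySet")
```

If an entity you expected is missing from the list, the usual cause is a missing SetEntitySetAccessRule call for it in the service.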


Hybrid Features In SharePoint 2013 and Office 365

For those of you who haven’t heard yet, we have some great new features in SharePoint 2013 that enable hybrid environments between on-premises and o365 farms.  These features allow you to use the BCS applications you have on premises in your o365 farm, and also to see search results from one farm while in the other – i.e., if you are in your on-premises farm when you run a query, you can see results from both your on-premises farm and o365 in one consolidated view.  If you’re lucky enough to be at the SharePoint Conference (SPC) this week, you can see Brad Stevenson present on this topic in session SPC 243 – Hybrid Overview: Connecting SharePoint 2013 On-premises to SharePoint Online in Office365.  Unfortunately, I was going to present this topic along with Brad, but I got run over by one locomotive of a respiratory infection, so I’m stuck just writing about it for now.

There’s a lot of configuration and setup required to get this working, but we have a couple of resources to help you.  First, we just published a set of whitepapers and documentation on getting this set up and going.  You can visit the download center at http://technet.microsoft.com/en-us/library/jj838715.aspx to get an overview of the resources we have.  If you follow the link on the bottom of the page to http://www.microsoft.com/en-us/download/details.aspx?id=35593 then you can download the papers.  To set up a hybrid search environment you will want to download the sps-2013-config-one-way-hybrid-environment document or PDF.

If you are just interested in seeing a demo of how it works, here are some links to what Brad is going to show at SPC:

More SharePoint 2013 Hybrid Search Tips

As more folks are deploying the SharePoint 2013 Hybrid features we continue to pick up little tidbits that help make the journey easier.  A couple of new ones have come up recently that are worth sharing at this point, so here goes:

  1. Don’t include a -StartDate and -EndDate when creating a new MsolServicePrincipalCredential.  At one time there was a problem that could occur without these being set, but it has since been resolved.  Excluding them altogether is simpler and removes another opportunity for mistakes.  If you get the dates wrong, you will find that when you try to execute a query against o365 from your on-prem farm you get an error along the lines of “invalid JWT token” (you’ll have to dig into the ULS logs to get this level of detail on the issue).
  2. PLEASE be careful when you create your Search Service Application that you do NOT create it in partitioned mode.  If you created it in partitioned mode, hybrid search will not work; again, you will get an invalid JWT token error.  There must be some PowerShell script floating around the interwebs that creates it in partitioned mode, because we’ve seen a few of these cases come up recently – which is unusual, because most customers don’t knowingly create their service applications in partitioned mode.