System.ServiceModel.FaultException in IIS

I hit a very strange error the other day when migrating a project from Visual Studio’s IIS Express to the local IIS.

The project was a Web API that interfaced down to a Microsoft Dynamics CRM 2015 instance on the same box and had been working well within the Visual Studio environment. However, as soon as the project was deployed to the local IIS instance, the connection to the CRM failed with a rather obscure System.ServiceModel.FaultException. Unfortunately, there was little else in the exception details to go on, and an initial Google search yielded little help as well.

After many different tests to try and isolate the issue, I finally found the reason for the failure: the authentication was failing. Again, after a few more tests I still couldn’t get the authentication to work, but I eventually came across an answer to a similar problem on Stack Overflow (http://stackoverflow.com/questions/5981167/error-message-the-request-for-security-token-could-not-be-satisfied-be). Scroll down to the first answer (from Sixto) and he mentions that you need to change the AppPool identity! Voila! The default identity setting is ApplicationPoolIdentity, which is fine if you are simply reading and running the code in ASP.NET pages; however, it won’t allow access to databases and the like, so you need to change the identity to a user that has been granted access to whatever resource your web page is trying to gain access to.
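
To confirm which account the worker process is actually running as, before and after changing the AppPool identity, a quick diagnostic along these lines can help. This is just a sketch to drop into any page or controller action, not part of the original fix:

using System.Security.Principal;

// Prints the Windows account the IIS worker process runs under.
// With the defaults this will be something like "IIS APPPOOL\DefaultAppPool";
// after the change it should be the user you configured.
string identity = WindowsIdentity.GetCurrent().Name;
System.Diagnostics.Trace.WriteLine("Worker process identity: " + identity);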

Here is a page with some more details on the different web servers Visual Studio can use along with the advantages and disadvantages of each.

https://msdn.microsoft.com/en-us/library/58wxa9w5%28v=vs.120%29.aspx?f=255&MSPPError=-2147217396#iisexpressdisadvantages

Ajax comes to EmigratingToOz.com

The goal of the EmigratingToOz.com website is to be the oracle of information about Australia for UK poms and to help them through the emigration process. To that end, one of the things we needed to do was supply real-time information about Australia covering weather, money and time.

Initially, the website was set up to actively get this information from a variety of sources and simply display it on the page via the master page. However, the weather information in particular comes from a slowish website that was killing the initial load time of the site.

The solution was simply to make Ajax calls that fetch the required information asynchronously. Using the generic library that we created to demonstrate Ajax calls on our main website, we were able to implement the calls for all of the data in one quick afternoon. The result is a quicker and more pleasing experience.
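
The generic library itself isn’t reproduced here, but to sketch the idea: the server side of such a call can be a small HttpHandler that returns only the slow fragment, which the page requests via XMLHttpRequest once the initial HTML has rendered. The names below are illustrative only, not the actual EmigratingToOz.com code:

using System.Web;

// Hypothetical endpoint (e.g. registered as weather.ashx) that is called
// asynchronously after page load, so the slow upstream weather source no
// longer blocks the initial render.
public class WeatherHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Fetch (ideally from a short-lived cache) the third-party weather
        // data and return just the HTML fragment the client script injects.
        string weatherHtml = "<p>Sydney: 24&deg;C, sunny</p>"; // placeholder for the real lookup
        context.Response.ContentType = "text/html";
        context.Response.Write(weatherHtml);
    }
}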

New CodeConsults website – Emigrating To Oz

We’ve recently started a new project website called Emigrating To Oz.

It’s a site centred around providing a resource of information for people wishing to emigrate to Australia. It’s still in its infancy and we are targeting a Q2 2008 release with all the bells and whistles, but in the meantime the site is live so that we can begin getting it indexed and tested.

There’s plenty of work to do and content to add, and we’re hoping that in time it will become an excellent resource for us Poms who are looking at a possible life Down Under.

Auto-generating a sitemap conforming to Sitemap Protocol 0.9

Having recently installed BlogEngine.NET, I was interested to see the auto-generated sitemap that Mads had created. Basically, he implemented a custom HttpHandler that, when requested, responds with XML conforming to the Sitemap Protocol 0.9 dictated by sitemaps.org. This is an XML schema that all of the major search engines, including Google, Yahoo! and Microsoft, have signed up to.

What I wanted to do was implement this for my own site, but Mads’s custom handler was tailored to BlogEngine.NET, so it required a bit of tweaking. Now, I have a Web.sitemap for my website, so in the spirit of doing things generically, I decided to implement the HttpHandler to utilise the Web.sitemap file to generate the sitemap response. To do this, first we have to load up the sitemap file…

// Requires System.Web (XmlSiteMapProvider) and System.Collections.Specialized (NameValueCollection).
// Load the site's Web.sitemap into a provider so we can walk its node tree.
XmlSiteMapProvider xmlSiteMap = new XmlSiteMapProvider();
NameValueCollection myCollection = new NameValueCollection(1);
myCollection.Add("siteMapFile", "Web.sitemap");
xmlSiteMap.Initialize("provider", myCollection);
xmlSiteMap.BuildSiteMap();

Then we have to navigate through the tree structure identifying the nodes that we are interested in…

// Recursively walks the sitemap tree, writing a <url> entry for every
// node that has a resolvable URL.
private static void ProcessNode(XmlWriter writer, SiteMapNode node, string attribute)
{
    foreach (SiteMapNode siteMapNode in node.ChildNodes)
    {
        string actualUrl;
        if (siteMapNode.HasChildNodes)
        {
            // Depth-first: emit the children before the node itself.
            ProcessNode(writer, siteMapNode, attribute);
        }
        // Prefer the custom attribute (if supplied), falling back to the
        // node's Url property; skip nodes that have neither.
        actualUrl = siteMapNode[attribute];
        if (string.IsNullOrEmpty(actualUrl))
        {
            actualUrl = siteMapNode.Url;
            if (string.IsNullOrEmpty(actualUrl))
            {
                continue;
            }
        }
        WriteUrl(actualUrl, writer);
    }
}

And lastly building up the XML response…

private static void WriteUrl(string actualUrl, XmlWriter writer)
{
    // Only relative URLs can be mapped to a physical file for a
    // last-modified date; absolute/external URLs are ignored here.
    if (Uri.IsWellFormedUriString(actualUrl, UriKind.Relative))
    {
        FileInfo fileInfo = new FileInfo(HostingEnvironment.MapPath(actualUrl));
        writer.WriteStartElement("url");
        // Derive the site root from the current request and append the relative URL.
        writer.WriteElementString("loc", HttpContext.Current.Request.Url.AbsoluteUri.Replace(HttpContext.Current.Request.Url.AbsolutePath, "") + actualUrl);
        writer.WriteElementString("lastmod", fileInfo.LastWriteTime.ToString("yyyy-MM-dd"));
        writer.WriteElementString("changefreq", "monthly");
        writer.WriteEndElement();
    }
}
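
For completeness, here is a rough sketch of how the handler class itself might tie these pieces together. The class and namespace match the web.config registration below, but the body is illustrative, and the attribute name passed to ProcessNode ("customUrl") is just a placeholder for whatever custom URL attribute your Web.sitemap nodes carry:

using System.Collections.Specialized;
using System.Web;
using System.Xml;

namespace CodeConsults.HttpHandlers.Generic
{
    public class Sitemap : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            // Load Web.sitemap as shown earlier.
            XmlSiteMapProvider xmlSiteMap = new XmlSiteMapProvider();
            NameValueCollection myCollection = new NameValueCollection(1);
            myCollection.Add("siteMapFile", "Web.sitemap");
            xmlSiteMap.Initialize("provider", myCollection);
            xmlSiteMap.BuildSiteMap();

            context.Response.ContentType = "text/xml";
            using (XmlWriter writer = XmlWriter.Create(context.Response.OutputStream))
            {
                // Root element and namespace required by Sitemap Protocol 0.9.
                writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");
                ProcessNode(writer, xmlSiteMap.RootNode, "customUrl"); // placeholder attribute name
                writer.WriteEndElement();
            }
        }

        // ProcessNode and WriteUrl are the methods shown above.
    }
}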

I have implemented this HttpHandler in a separate component, simply to allow me to reference the utility from other websites that I have written, but you may decide to simply plug the HttpHandler into your main project… the decision is yours.

Now, hopefully you’ve also realised that you’ll need to register the HttpHandler in the website’s web.config, like so…

<httpHandlers>
  <add verb="*" path="sitemap.axd" type="CodeConsults.HttpHandlers.Generic.Sitemap" validate="false"/>
</httpHandlers>

Now you can run your website, point your favoured browser at http://website/sitemap.axd and voila, your sitemap is available for all to see. The last stage should be to update your robots.txt file to tell the search engines that you have a nice sitemap for them to use. To do this, simply open your robots.txt file and add the following…

Sitemap: http://www.codeconsults.com/sitemap.axd

Alternative to Server.MapPath…use HostingEnvironment.MapPath

A call to the following context-sensitive MapPath function does exactly what the documentation says it does.

// Resolves relative to the directory of the page handling the current request.
System.Web.HttpContext.Current.Server.MapPath("filename.txt");

It returns the physical file path that corresponds to the specified virtual path on the web server. But this isn’t always what you want, as it works on the current context, which may be a web form running in a sub-directory of the main website. It also relies on HttpContext.Current, which is null outside of a request (on a background thread, for example).

In this case the solution is simple. Use the following code to map an application-root-relative path to a physical path on the server…

// Resolves relative to the application root, regardless of the current request.
System.Web.Hosting.HostingEnvironment.MapPath("~/filename.txt");
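
As a quick illustration of the difference, here is a sketch assuming a page living in an /admin sub-directory of a site rooted at C:\inetpub\wwwroot\MySite (the paths are illustrative, not output from a real site):

using System.Web;
using System.Web.Hosting;

// App-root-relative paths resolve the same way with both APIs...
string a = HttpContext.Current.Server.MapPath("~/filename.txt");
// -> C:\inetpub\wwwroot\MySite\filename.txt

// ...but a bare relative path is resolved against the *current request's*
// directory by Server.MapPath:
string b = HttpContext.Current.Server.MapPath("filename.txt");
// -> C:\inetpub\wwwroot\MySite\admin\filename.txt

// HostingEnvironment.MapPath has no dependency on the current request, so it
// also works on background threads where HttpContext.Current is null.
string c = HostingEnvironment.MapPath("~/filename.txt");
// -> C:\inetpub\wwwroot\MySite\filename.txt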