Lotus Notes to SharePoint 2010 Migration

Recently I was engaged in a large-scale Lotus Notes to SharePoint 2010 migration project for a client in the travel/aviation industry. Lotus Notes is a multi-user, collaborative client-server application environment competing with the Microsoft SharePoint 2010 suite of products, Microsoft Outlook and Office Workspace 2010. There are numerous legacy Lotus Notes implementations which can become an overhead to maintain and scale over time. This post summarises the approach taken to migrate such an application to SharePoint 2010.

The legacy Notes applications in question, designed and developed a few years back, were no longer helping to accelerate business outcomes for the client. Migrating those applications onto a single platform that can be centrally managed and maintained over time was therefore timely.

Notes Architecture

The Domino product is an integrated messaging and Web application software platform. It provides a robust environment for creating Web applications that support workflow, RDBMS connectivity, and collaboration. Domino is ideal for creating applications that allow "sometimes connected", multiple users to cooperatively process unstructured information. The term Lotus Notes is now used to refer to the underlying client-server architecture that the product is built upon.

Storage Model

The storage model of Lotus Notes centres on a "Note" (these Notes are also referred to as documents, a term more familiar to users). Each Note contains "items", and each item has its own item name; an item may hold a list of values. This recursive structure of one element containing other elements is referred to as the Notes container model. The underlying store is the "Notes Storage Facility", and the file created on the Notes client is given the extension .NSF. The Notes data model contains no concept of a "record type": all documents in a database are equivalent as far as Notes is concerned.
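The container model can be pictured (illustratively only, this is not the Notes API) as a bag of named items, where each item may hold a list of values and no schema constrains which items a document carries:

```csharp
// Illustrative sketch of the Notes container model: a document ("Note")
// is just a set of named items; an item may be a list of values, and
// there is no record type constraining which items appear.
var note = new Dictionary<string, IList<object>>
{
    { "Form",      new List<object> { "ExpenseClaim" } },      // hypothetical item names
    { "Author",    new List<object> { "CN=Jane Doe/O=Acme" } },
    { "Approvers", new List<object> { "Bob", "Sue" } }         // a multi-value item
};
```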

User Interface

Forms are the mechanism the Notes client (desktop client) uses to show the user what is in a document. The areas on the form that display the values stored in a document's items are called fields. A listing of item values from a set of documents in column format is a Notes view.


You can program Notes to perform tasks automatically using agents (also known as macros). Agents can help you perform repetitive tasks, such as managing documents and sending memos, and can complete almost any action you could do manually in your databases. When you create an agent, you can set options to specify:

  • When the agent should run (e.g. when a document has been changed).
  • Which documents in the database the agent should run on. You can target all or any subset of documents in the database for the agent to run on.
  • What actions the agent should complete. You can choose actions from a list or use Notes formulas or programming scripts and languages to create actions.

Tools for the Job

  • Visio 2010 Premium – Modelling the new process workflows with business analysts/domain experts
  • SharePoint Designer 2010 – Authoring the workflows/ customising list forms etc.
  • Visual Studio 2010 – Create Solution Artefacts/ Custom event handlers/ create features, packaging sandbox .wsp solutions
  • SPServices jQuery library for SharePoint web services – UI extensibility helpers / Cascaded dropdowns etc.
  • Bit Bucket  with Mercurial/ VisualHG -Distributed Version Control (DVCS)
  • Quest Notes Migrator for SharePoint – For Data Mapping and Migration from Notes fields to SharePoint Lists.

Migration Map

Each Notes application was mapped to a corresponding SharePoint 2010 custom application in its own site collection with its own logical security boundary.



The new applications were architected and implemented to leverage the platform capabilities of SharePoint 2010, and the required data was mapped into the new platform. Quest Notes Migrator did a good job for us, nicely abstracting the Lotus Notes internals and allowing us to define Extract-Transform-Load (ETL) operations, including user mappings from Domino to AD accounts. Where transformations were complex, an Extract-Load-Transform (ELT) approach was used with custom transformation rules written in PowerShell/C#.
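For illustration only (the names here are hypothetical, not the Quest API), a custom ELT transformation rule in C# might look like:

```csharp
// Hypothetical ELT transformation rule: runs after raw Notes data is loaded
// into a staging list, mapping Domino user names to AD accounts.
public string TransformAssignee(string dominoName,
                                IDictionary<string, string> accountMap)
{
    // e.g. "CN=Jane Doe/O=Acme" -> "ACME\jdoe"; fall back to a service account
    string adAccount;
    return accountMap.TryGetValue(dominoName, out adAccount)
        ? adAccount
        : @"ACME\svc_migration";
}
```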

The high-level process followed for migrating each Notes application to the SharePoint 2010 platform involved the following steps:

  1. Consult the Notes application owners (product owners) in each department and create a prioritised product backlog
  2. Model the new workflows/rules in Visio 2010 Premium with business users
  3. Build and prototype the new application with the best tool for the job (Visual Studio 2010 – author and deploy base-level custom site columns/content types/list definitions/list instances/custom event handlers; SharePoint Designer 2010 – forward-engineer the Visio workflows and customise XSLT list forms)
  4. Create and test the data mapping jobs with Quest Notes Migrator for SharePoint in the dev environment with a subset of production data
  5. Repeat steps 3 & 4 with continuous product reviews/feedback
  6. Reverse-engineer reusable workflows and list forms into solution artefacts, package them into SharePoint solution packages, and deploy them to the production environment as SharePoint features
  7. Execute the data migration job on the production environment

I hope this post benefits those who are interested in migrating legacy Lotus Notes applications to SharePoint 2010.

Improving the quality of your custom SharePoint 2010 solutions via Continuous Integration (CI) with Team Foundation Server 2010

This is a follow-up blog post based on the content of my presentation "SharePoint 2010 ALM", given at the Melbourne SharePoint User Group meeting (#MOSSIG) on 23rd March 2011. Since then I've received a number of requests from the community to share the content and references, so I thought I'd write a slightly more detailed post about the topic. Here we go.

Continuous integration is a critical process in modern product development for improving the quality of your solution from the first sprint (iteration), if you are following an agile approach like Scrum. It allows you to detect and rectify issues in your solution early, which increases the chances of success without painful integration issues and surprises at the last minute of your product development cycle.

How does it improve the quality of my custom SharePoint solution?

Well, if you have a mature, fully automated continuous build (CI build) and a nightly build, you are on the right path, and your SharePoint development team enjoys the following in every sprint.

  1. It forces you to promote code from Dev -> Test -> Staging -> Production in the correct way in the first place; otherwise automation will not be possible.
  2. Every code check-in by a developer kicks off automated code validation, runs unit tests (if any) and builds the SharePoint solution (.wsp files) on the Team Foundation Server.
  3. A nightly build does everything in step 2, plus adds, deploys and activates your SharePoint solution packages into your test farm, and can run automated coded UI tests and automated load tests (if any) for you.

Here is a sketch of the workflow.


Why was this approach not so common in SharePoint development?

The lack of SharePoint development tooling support in VS2008 contributed significantly, coupled with the natural complexity that arises when you develop on a platform rather than a development framework. When it comes to SharePoint, you do not necessarily develop in the Visual Studio environment: you can build quite powerful collaborative solutions and branding work in SharePoint Designer as no-code solutions. But unless those customisations are converted into solution artefacts, we cannot cleanly source-control and maintain them, which breaks the fundamental concept of Application Lifecycle Management (ALM).

The Challenge

SharePoint 2010 has evolved into, and is positioned as, a single unified platform for most organisations, playing a mission-critical role in day-to-day operations. With the introduction of Business Connectivity Services (BCS), the client object model, sandboxed solutions, WCF-based service applications and claims-based authentication, SharePoint 2010 has also evolved significantly as an application development platform, opening up a number of extensibility options for developers.

We also have an integrated set of tools built into Visual Studio 2010 for SharePoint development. So the challenge is: how do we use these tools effectively and efficiently in a team-based environment to build a product or solution and deploy it into those shared, mission-critical environments confidently and with consistent quality? This post describes how to implement an automated continuous build with the help of the following tools to validate, build and deploy SharePoint solutions.

Step 1: Get all your customisations into solution artefacts. Reverse-engineer your SharePoint Designer customisations and featurise them. The CKSDev Branding SPI can help you deploy master pages and CSS as features. Save all SharePoint Designer authored workflows as reusable workflows (.wsp) from SharePoint Designer and bring them into the Visual Studio environment.


Step 2: Create a new Team project and add your solution to source control under TFS. For the purposes of this post I am using a simple VS2010 solution named SPHelloWorld. For code analysis, the SPDisposeCheck ruleset needs to be in source control and can be shared across multiple projects within your repository.


Step 3: Enable code analysis for your solution via the project properties and select the SPDisposeCheck ruleset from your local TFS workspace as shown below.


Step 4: Note that code analysis now runs the SPDisposeCheck rules as part of compilation, and both the Dispose and Do Not Dispose rules fire when poor code is detected.
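For context, the canonical pattern these rules catch is an SPSite/SPWeb that your code creates but never disposes; a minimal illustration (the site URL is just a placeholder):

```csharp
using Microsoft.SharePoint;

// Flagged: SPSite/SPWeb created but never disposed (leaks unmanaged resources).
void Bad()
{
    SPSite site = new SPSite("http://intranet"); // placeholder URL
    SPWeb web = site.OpenWeb();
    // ... work with web
}

// Clean: deterministic disposal via using blocks.
void Good()
{
    using (SPSite site = new SPSite("http://intranet"))
    using (SPWeb web = site.OpenWeb())
    {
        // ... work with web
    }
}
```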



Step 5: Now open the default build template for the project. We are going to extend the default build definition to deploy our solution to the test SharePoint farm remotely. TFS 2010 uses a Windows Workflow Foundation 4 based .xaml template. Drag the activities from the toolbox in the left sidebar and assemble them as shown below, just after packaging has finished.

The high-level tasks performed by this segment of the workflow are:

  1. Build the solution
  2. Run code analysis with SPDisposeCheck on the TFS server. If code analysis fails, the build fails at this point.
  3. Create the .wsp solution package
  4. Check the user-specified deployment flag (a workflow initiation parameter)
  5. Write a log message to the log file
  6. Map an admin share drive to the target SharePoint server (test farm) with the user-specified credentials
  7. Set the STSADM path on the target server
  8. Stop the V4 admin service in the target farm (this uses PSEXEC to remotely execute STSADM on the target server)
  9. Copy each .wsp file to the target SharePoint server, then add, deploy and execute the timer job in the user-specified sequence
  10. Finally, turn the V4 admin timer service back on
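Conceptually, each of the remote STSADM steps above boils down to shelling out to psexec from the workflow's InvokeProcess activities; a rough C# equivalent (server names and paths here are illustrative placeholders) would be:

```csharp
using System.Diagnostics;

// Illustrative only: how an InvokeProcess activity shells out to psexec
// to run STSADM on the target SharePoint server.
const string StsadmPath =
    @"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN\stsadm.exe";

static void RunRemoteStsadm(string server, string stsadmArgs)
{
    var psi = new ProcessStartInfo
    {
        FileName = @"C:\Tools\psexec.exe",   // assumed psexec location
        Arguments = string.Format(@"\\{0} ""{1}"" {2}", server, StsadmPath, stsadmArgs),
        UseShellExecute = false
    };
    using (var p = Process.Start(psi)) { p.WaitForExit(); }
}

// e.g. RunRemoteStsadm("SPTEST01", "-o addsolution -filename SPHelloWorld.wsp");
```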


If you are interested, the customised TFS 2010 workflow template for building and deploying .wsp solutions can be downloaded from my SkyDrive here.

Now define the additional workflow initiation parameters that need to be supplied by the user when starting a new team build.


Now set the activity properties for each activity. As a sample, I've shown below the properties set for the "Add solution" InvokeProcess workflow activity. Once all properties are set, WF4 validates your workflow. Once all validations pass, check your custom build definition template into TFS.



Step 6: Author a new build definition based on the updated workflow template.


Some important parameters that you need to set are shown below.



Step 7: It's time to queue a new build. Now assume a scenario where a developer rushes to check in, overriding the check-in policies set by the build master.


And here is the result: the TFS build agent fails your build because your code did not pass code analysis. No .wsp files are produced.


So the developer now fixes the code with proper disposal, as shown below, and checks the code in again.



And after a few seconds we get this TFS build status. All green!


Opening the build log gives us the workflow execution log as shown below. Note how STSADM has been invoked remotely via psexec.exe on the TFS build agent.



And see below the end result in Central Administration in our test SharePoint farm.


On a final note, "the right tools make all the difference". But tooling itself may not help you build a quality product; the process matters. TFS 2010 supports you with your chosen process and allows you to tailor the default process templates if required. You have the chance to select the process template when you create a new Team project, as shown below. I highly encourage you to check out the new "Microsoft Visual Studio Scrum 1.0" template, co-developed by Microsoft in collaboration with the Scrum community. The template guides you through Scrum-based agile product development. I've highlighted a few areas below.

  • Creating effective user stories
  • Creating, prioritising and grooming the product backlog
  • Agile estimation and sprint planning
  • Scrum artefacts such as the sprint burndown and release burndown charts


And here is the high-level process behind the template.



The correct tools coupled with the correct process will take your product/solution to the next level faster and more smoothly, making you a successful SharePoint developer and, in turn, a successful product/solution delivery team.

Where to go from here?


Consuming Windows Azure DataMarket with SharePoint 2010 via Business Connectivity Services

The Windows Azure Marketplace DataMarket is a service that provides a single, consistent marketplace and delivery channel for high-quality information as cloud services. It provides high-quality statistical and research data across many business domains, from finance to healthcare. You can read more about it here.

How do we consume this content in an enterprise scenario? In this post we look at how to integrate the Azure Marketplace DataMarket with SharePoint 2010 via Business Connectivity Services and a .NET connector, surfaced as an external list. This enables us to expose Azure DataMarket data through SharePoint 2010 lists, which opens up a world of possibilities including going offline with SharePoint Workspace and participation in workflows, enterprise search and so on.

Let's first start with the Azure DataMarket. Log on to https://datamarket.azure.com with your Live ID and subscribe to a dataset that you are interested in. I selected the European Greenhouse Gas Emissions dataset. Also note down your account key in the Account Keys tab, as we will need this information later on.


The dataset details give you the input parameters and results as follows.


You can navigate to the dataset link in your browser and view the OData feed. We are going to consume this OData feed with WCF Data Services, which allows us to use LINQ to filter the data.
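Once a WCF Data Services reference is generated for the feed, the filter can be expressed in LINQ. As a sketch only (the context, entity-set and property names below are assumptions, not the actual generated names):

```csharp
// Sketch: LINQ over the DataMarket OData feed via a generated
// WCF Data Services context (type and entity-set names are illustrative).
var ctx = new GreenhouseGasEmissionsContainer(
    new Uri("https://api.datamarket.azure.com/..."));  // dataset service root (elided)
ctx.Credentials = new System.Net.NetworkCredential(liveId, accountKey);

var danishStats = from s in ctx.EmissionStats          // hypothetical entity set
                  where s.Country == "Denmark"
                  select s;
```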

LINQPad is a nifty tool for developing and testing LINQ query expressions. It allows you to connect to Windows Azure DataMarket datasets and develop queries against the data as shown below.


Now let's move on to VS2010 and create a Business Data Connectivity (BDC) Model project as shown below.



Rename the default entity in the BDC model designer and set the properties that you want to expose in the external content type.


Once all properties are set, the BDC Explorer displays my model as shown below.


Now our BDC model is ready, so let's implement the ReadList and ReadItem BDC operations.

First, add a service reference to your project. Set the address to the location of your Azure DataMarket dataset and discover the OData feed as shown below.


After the service reference is added, we can implement the ReadItem method as shown below.



Note that the service reference context requires your Live ID and account key to be passed as credentials into the service call.
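In code this amounts to something like the following (liveId and accountKey are placeholders for your own values from the Account Keys tab):

```csharp
// Pass your DataMarket credentials into the generated service context.
_ctx.Credentials = new System.Net.NetworkCredential(liveId, accountKey);
```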


Once all operations are implemented, press F5 to build and deploy your BDC model into SharePoint. Once deployed, log on to Central Administration, go to Manage Service Applications, select Business Connectivity Services, and view the details of the EmissionStat content type being deployed. You might want to set permissions for this content type at this point.


Now that the external content type is defined, all we need is an external list based on it. Fire up SharePoint Designer and create an external list referencing the new external content type just created.


Once the list is created, navigate to the list in the portal. Note how Windows Azure Marketplace DataMarket content is now surfaced through a SharePoint 2010 list as shown below. The VS2010 solution associated with this post can be downloaded here.


Viewing an item shows the full item details using our ReadItem BCS operation.


Integrating structured data (Azure) with unstructured content (collaborative data) in SharePoint allows you to create high-business-value composites for informed decision-making at all levels of your organisation. Happy cloud collaboration!

SharePoint 2010 Live Image Capturing WebPart packaged as a Sandbox solution

It was great speaking and networking with fellow members at the MOSSIG July meeting in Melbourne last week. I presented a session on the effective use of Silverlight 4 with SharePoint 2010. The slide deck is uploaded here. I demoed a Silverlight 4 web part taking a snapshot on demand and, with the help of the client object model, automatically saving the captured snapshot to a SharePoint picture library.

This demo solution was created to showcase the effective use of Silverlight 4 in SharePoint 2010, as well as to show how the MVVM (Model-View-ViewModel) pattern is used in designing and developing maintainable and testable Silverlight applications with powerful declarative data binding capabilities.

I thought I'd write a slightly more elaborate blog post about this solution, so here we go…

First, let's have a look at the outcome.

Final Result:


By the way this is my little boy Bryan. He enjoyed this a lot  🙂

The picture library to be used for storing the captured images can be configured by the user as shown below as a WebPart property.



Solution Type:

This is a sandboxed solution. Therefore, as a site collection administrator you can add this solution to your solution gallery and try out its functionality with minimal impact on your farm compared to a classic farm solution. SharePoint performs resource monitoring of your sandboxed solutions as shown below. If a solution doesn't behave properly, SharePoint will shut it down before you even know about it at times. How cool is that?

This is a great model for ISVs to distribute their apps, and for farm/site collection administrators to trial third-party solutions with minimal risk.



Solution architecture:

The basic solution architecture is shown below. The ViewModel decorates the model/repository and sits between your model (domain objects) and the view. MVVM is targeted at modern UI development platforms (WPF and Silverlight). You can read more about this simple pattern here.


The VS2010 solution structure for this application is as follows.


SandBox WebPart properties:

One of the limitations of running in the sandbox is that all web parts must inherit from System.Web.UI.WebControls.WebParts.WebPart instead of Microsoft.SharePoint.WebPartPages.WebPart (which is not allowed). This means the traditional way of creating custom web part properties using SharePoint's ToolPart object is not available, and we must instead rely on how custom properties are created for ASP.NET web parts, which (as the sandbox restriction makes obvious) is now the preferred way to create web parts. ASP.NET web parts have an equivalent of the ToolPart class called the EditorPart, so I've created my web part property editor inheriting from EditorPart as shown below.

// Create a custom EditorPart to edit the WebPart control.
[AspNetHostingPermission(SecurityAction.Demand,
    Level = AspNetHostingPermissionLevel.Minimal)]
public class SLWebCamEditorPart : EditorPart
{
    private DropDownList _partTargetLibrary;

    public SLWebCamEditorPart()
    {
        Title = "Image capture webpart settings";
    }

    private DropDownList PartTargetLibrary
    {
        get { return _partTargetLibrary; }
    }
}

Passing initialisation parameters into Silverlight from SharePoint:
The CreateChildControls method in the hosting web part injects the Silverlight host control into the web part as shown below. In this case we inject the current site URL and the target picture library name as 'initParams' into the Silverlight host control.
NOTE: It is important to inject the current site URL in order to guarantee the ClientContext (in the SharePoint client OM) is not null during runtime execution.
[Personalizable, WebBrowsable(false)]
public String PictureLibraryName { get; set; }

protected override void CreateChildControls()
{
    var width = "400";
    var height = "600";
    base.CreateChildControls();
    string source = SPContext.Current.Site.Url + "/_catalogs/masterpage/SLWebCam.xap";
    Controls.Add(new LiteralControl(
        "<div>" +
        " <object data=\"data:application/x-silverlight-2,\" type=\"application/x-silverlight-2\" width=\"" + width + "\" height=\"" + height + "\">" +
        " <param name=\"source\" value=\"" + source + "\"/>" +
        " <param name=\"onerror\" value=\"onSilverlightError\" />" +
        " <param name=\"background\" value=\"white\" />" +
        " <param name=\"minRuntimeVersion\" value=\"4.0.50401.0\" />" +
        " <param name=\"autoUpgrade\" value=\"true\" />" +
        " <param name=\"initParams\" value=\"MS.SP.url=" + SPContext.Current.Site.Url +
        ",PictureLibraryName=" + PictureLibraryName + "\" />" +
        " </object>" +
        "</div>"));
}


The initParams values injected on the server side can be extracted on the client side and stored as application resources as shown below.

private void Application_Startup(object sender, StartupEventArgs e)
{   // Extract init params and inject into the application resource dictionary
    foreach (var resource in e.InitParams)
    {
        Resources.Add(resource.Key, resource.Value);
    }
    this.RootVisual = new MainPage();
}

Declarative DataBinding:

The XAML-based view was created using the VS2010 enhanced UI designer. The ViewModel exposes the necessary bindable properties and bindable commands (using the command pattern); in other words, the ViewModel is the DataContext for the view.

For example, the XAML markup for showing the live webcam video uses a Rectangle bound to a property named VideoBrush in the ViewModel, as shown below.

<Rectangle Fill="{Binding VideoBrush}" Height="138" Width="170" />

private VideoBrush videoBrush;
public VideoBrush VideoBrush
{
    get { return videoBrush; }
    set { videoBrush = value; OnPropertyChanged("VideoBrush"); } // standard change-notification helper assumed
}

The ViewModel implements the INotifyPropertyChanged interface, which notifies clients that a property value has changed. The snapshots taken are exposed as an ObservableCollection of WriteableBitmap objects from the ViewModel and bound to a ListBox with a DataTemplate, as shown below. An ObservableCollection represents a dynamic data collection that provides notifications when items are added or removed, or when the whole list is refreshed.

public ObservableCollection<WriteableBitmap> Images { get; set; }
   public ICommand StartCaptureCommand { get; set; }
   public ICommand StopCaptureCommand { get; set; }
   public ICommand TakeSnapshotCommand { get; set; }
   public ICommand SaveSnapshotCommand { get; set; }

<ListBox x:Name="Snapshots" ItemTemplate="{StaticResource SnapshotDataTemplate}"
         ItemsPanel="{StaticResource SnapshotItemsPanelTemplate}"
         ItemsSource="{Binding Images}"
         SelectedItem="{Binding SelectedSnapshot, Mode=TwoWay}"
         Height="123" Width="356" />

Saving the captured image to the SharePoint picture library:
To use the SharePoint client OM, all you need is to reference the following two assemblies from the SharePoint root folder (Web Server Extensions\14\TEMPLATE\LAYOUTS\ClientBin) and use the following namespace from Silverlight.

using Microsoft.SharePoint.Client;
I am also using the Silverlight image manipulation library "ImageTools" from http://imagetools.codeplex.com/ for handling bitmap processing on the client side.

When the user selects a snapshot and clicks the save button, a ClientContext is initiated and the list information is obtained from the server via the client OM as an asynchronous operation. In the successful callback, a FileCreationInformation object is created with the image content, the new file is added to the list of files, and the commit is performed as another asynchronous call to the server.
/// <summary>
/// Obtain the list instance with the required minimum information via the client OM
/// </summary>
private void GetSpTargetListInfo(string listname)
{
    // first build the workload
    _targetlist = _ctx.Web.Lists.GetByTitle(listname);
    _ctx.Load(_targetlist, l => l.RootFolder.ServerRelativeUrl);
    // now execute it as one single batch
    _ctx.ExecuteQueryAsync(ListloadSucceeded, ListloadFailed);
}

private void ListloadSucceeded(object sender, ClientRequestSucceededEventArgs e)
{
    if (_img == null) return;
    using (Stream fs = _img.ToStream())
    {
        var fileContent = new byte[fs.Length];
        fs.Read(fileContent, 0, fileContent.Length);
        // create the file object
        var file = new FileCreationInformation
        {
            Content = fileContent,
            Url = ApplicationContext.Current.Url +
                  _targetlist.RootFolder.ServerRelativeUrl + "/" + Guid.NewGuid() + ".png"
        };
        // add the new file to the target list
        var files = _targetlist.RootFolder.Files;
        files.Add(file);
        // now execute the workload
        _ctx.ExecuteQueryAsync(UploadRequestSucceeded, UploadRequestFailed);
    }
}


If you want to play with the webcam web part, a packaged sandboxed solution (SpImageCapture.wsp) can be downloaded here. When this solution is activated, a Silverlight .xap file named SLWebCam.xap is created in the site master page gallery and a hosting web part named SLWebCamWebPart.webpart is added to your web part gallery. Drag this web part onto your page and configure its properties by specifying a picture library to use for storing the images, and you are done. I've also uploaded the full VS2010 solution here if you are interested in the code.


Getting Started with Silverlight and SharePoint 2010 at #DDDMelbourne

I was honoured to speak at the DDDMelbourne developer conference last weekend in Melbourne. Thank you to everyone who voted, attended and commented on my session "Getting Started with Silverlight and SharePoint 2010". With SharePoint 2010 and Visual Studio 2010, the developer story has come a long way: SharePoint is now integrated into Visual Studio 2010 as a first-class platform, and Silverlight 4 gives developers the opportunity to create the next generation of Rich Internet Applications (RIAs). SharePoint 2010 integrates closely with Silverlight to let you build compelling user interfaces that interact with SharePoint for deep data integration with line-of-business systems. The session covered the techniques, data access strategies and developer productivity enhancements for rapidly building and deploying Silverlight 4 applications on the SharePoint 2010 platform, including the SharePoint client object model and RESTful services, along with sandboxed solutions.

The slide deck is uploaded here.

Develop a Sandboxed Silverlight 3.0 Web part for SharePoint 2010

I recently had a chance to play around with the Silverlight Bing Maps control v1 SDK and thought of integrating the map functionality with SharePoint as a web part. In this post I am going to go through the steps I took to accomplish this task. Basically, we are going to develop a Silverlight 3.0 web part as a sandboxed solution for SharePoint 2010. Our web part will interact with SharePoint 2010's new client object model to access data stored in a custom list and render the map with locations as pushpins. The end functionality looks as below. Our custom SharePoint list stores the locations with their latitudes and longitudes.


You need a SharePoint 2010 dev box with Visual Studio 2010, Expression Blend 3 and the Silverlight Bing Maps control library installed for this task. I have outlined the steps required to set up a development environment for SharePoint 2010 in my previous blog post.

Let’s get started.

Step 1: Create a new Silverlight 3.0 application in Visual Studio 2010.


Step 2: Choose to host your Silverlight application in an ASP.NET web application project.


Step 3: Add the SharePoint 2010 reference DLLs for interacting with the client object model to the Silverlight project you have created. The two DLLs you need to reference are located in the folder 14\TEMPLATE\LAYOUTS\ClientBin.


Step 4: Download and install the Bing Maps Silverlight Control SDK. The interactive live SDK can be found here. You will also need a developer key to access Bing Maps, which can be obtained here. Once the SDK is installed, reference the two Bing Maps DLLs in your Silverlight project; by default those DLLs are in C:\Program Files (x86)\Bing Maps Silverlight Control\V1.

Once all the necessary DLL references are added, our Silverlight project looks as below.


Step 5: Now it's time to design the user interface for our web part with XAML. Import the XML namespace for the Bing Maps control in the MainPage.xaml file.


Once the above namespace is imported, all you need is <m:Map/> to place the map control in your user interface. I've used Expression Blend 3 to author the XAML, and the output looks as below. Basically I have a simple Grid layout with a TextBlock and a Map control; the default map location points to central Australia in "Road" mode with zoom level 4.3. You can set these values as per your requirements. The CredentialsProvider key is what you get when you create a developer account for the Bing Maps control.

<UserControl x:Class="SLFactoryMap.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:m="clr-namespace:Microsoft.Maps.MapControl;assembly=Microsoft.Maps.MapControl">
    <UserControl.Resources>
        <Style x:Key="HeaderStyle" TargetType="TextBlock">
            <Setter Property="FontWeight" Value="Bold"/>
            <Setter Property="FontStyle" Value="Normal"/>
            <Setter Property="FontSize" Value="13.333"/>
            <Setter Property="VerticalAlignment" Value="Top"/>
            <Setter Property="HorizontalAlignment" Value="Center"/>
            <Setter Property="Foreground">
                <Setter.Value>
                    <LinearGradientBrush EndPoint="0.5,1" StartPoint="0.5,0">
                        <GradientStop Color="Black" Offset="0"/>
                        <GradientStop Color="#FF725E5E" Offset="1"/>
                    </LinearGradientBrush>
                </Setter.Value>
            </Setter>
            <Setter Property="Margin" Value="5,0,0,0"/>
        </Style>
    </UserControl.Resources>

    <Grid x:Name="LayoutRoot" Background="#FF64D3DC" Width="600" Height="490">
        <Grid.ColumnDefinitions>
            <ColumnDefinition Width="*"/>
        </Grid.ColumnDefinitions>
        <Grid.RowDefinitions>
            <RowDefinition Height="0.04*"/>
            <RowDefinition Height="0.92*"/>
        </Grid.RowDefinitions>
        <TextBlock Height="20" Style="{StaticResource HeaderStyle}" Grid.Row="0" Grid.Column="0" Grid.ColumnSpan="2"> Production Sites </TextBlock>
        <m:Map x:Name="factorymap" Grid.Row="1" Grid.Column="0" CredentialsProvider="{your key}" Mode="Road" ZoomLevel="4.3" Center="-25.7000,133.8833"/>
    </Grid>
</UserControl>

The XAML code above renders our UI in the Visual Studio designer canvas as shown below.


Step 6: Now it's time to create our code-behind class MainPage.xaml.cs in C#.

First, import the following namespaces:

using SP = Microsoft.SharePoint.Client;
using Microsoft.Maps.MapControl;

Then create the following class-level members:

SP.ClientContext _context;
SP.Web _currentWeb;
SP.List _locationList;
SP.ListItemCollection _locations;

Then we create the UI Loaded routed event handler and obtain the SharePoint client context reference so that we can access the SharePoint client object model. Once we get a reference to the web object, which corresponds to SPWeb in the server context, we obtain a client reference to the "FactoryLocations" custom list that stores our locations. We want all the list items in that list, so we need a CAML query with no filters. To reduce server calls, we batch up all the server requests and make one single call to the server to obtain the list data. Finally we make an asynchronous call to the server by calling _context.ExecuteQueryAsync. At this point the SharePoint client object model calls the WCF services, performs authentication, invokes the service methods and obtains the results for us, which frees us from a lot of plumbing work. This is a massive developer productivity feature that comes with SharePoint 2010.

void MainPage_Loaded(object sender, RoutedEventArgs e)
{
    // get the current client context
    _context = SP.ClientContext.Current;
    // get a reference to the current web
    _currentWeb = _context.Web;
    // get a reference to the list that contains the locations
    _locationList = _currentWeb.Lists.GetByTitle("FactoryLocations");
    // we need all items in the location list, so use an unfiltered CAML query
    SP.CamlQuery query = new SP.CamlQuery();
    query.ViewXml = "<View></View>";
    _locations = _locationList.GetItems(query);
    // execute the batched-up work with a single asynchronous server call
    _context.ExecuteQueryAsync(clientRequestSucceeded, clientRequestFailed);
}
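The empty `<View></View>` returns every item in the list. If the list grows, the same query object can carry a filtered CAML view instead; a sketch of what that might look like (the field name matches the list column used above, and the filter value and row limit are illustrative assumptions):

```xml
<View>
  <Query>
    <Where>
      <Geq>
        <FieldRef Name="Latitude" />
        <Value Type="Number">-45</Value>
      </Geq>
    </Where>
  </Query>
  <RowLimit>100</RowLimit>
</View>
```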

Once the server request is completed, a callback method runs, and any UI updates it makes need to happen on the UI thread. Our clientRequestSucceeded callback method looks as below. Here we iterate through the list item collection and create a Pushpin for each location list item.

Dispatcher.BeginInvoke makes sure that our anonymous method is executed on the client UI thread.

/// <summary>
/// Event handler for a succeeded request
/// </summary>
private void clientRequestSucceeded(object sender, SP.ClientRequestSucceededEventArgs e)
{
    // execute the code on the UI thread as an anonymous method
    Dispatcher.BeginInvoke(() =>
    {
        try
        {
            foreach (SP.ListItem loc in _locations)
            {
                Location location = new Location((Double)loc["Latitude"], (Double)loc["Longitude"]);
                // the pushpin to add to the map
                Pushpin pin = new Pushpin();
                pin.Location = location;
                // add the pushpin to the map
                factorymap.Children.Add(pin);
            }
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
    });
}

The failed-request handler looks as below.

/// <summary>
/// Event handler for a failed request
/// </summary>
private void clientRequestFailed(object sender, SP.ClientRequestFailedEventArgs e)
{
    // execute the code on the UI thread as an anonymous method
    Dispatcher.BeginInvoke(() =>
    {
        MessageBox.Show(string.Format("Error occurred while displaying factory locations: {0}", e.Message));
    });
}

Finally, we need to make sure the Loaded event is wired up in the MainPage constructor as shown below.

public MainPage()
{
    InitializeComponent();
    Loaded += new RoutedEventHandler(MainPage_Loaded);
}

Step 7: At this point we can build the project, and the Silverlight .xap file should be generated. Next, add an empty SharePoint project to the solution. Point the URL to the site collection you are working with and select the "Deploy as a sandboxed solution" option. Sandboxed solutions are SharePoint solution package (WSP) files that are limited in what they can do and in the server resources they can use: what they can do is limited through process isolation and code access security scoped to the SharePoint site, and the resources they can use are limited through process monitoring, logging, and log aggregation. The Sandboxed Solutions service provides a complete isolation system that ensures code running in a sandboxed solution cannot reach out and access information beyond the scope of the deployment. Specifically, sandboxed solutions cannot make updates to the SharePoint object model beyond the scope of the SPSite object; farm-level and web-application-level operations are allowed only for reads. This makes it the best option for developers as well as farm administrators.


Step 8: Add a Module as a new item to the SharePoint project. In this case I call it FactoryMapModule. This module item is used to include the .xap file in the SharePoint solution, so we need to tell the module the location of our .xap file. Right-click the module item –> go to Properties –> select Output References. In the dialog box, click the Add button and select your Silverlight project name, in this case SLFactoryMap. Also set the deployment type to ElementFile. Finally, click OK.

Now the .xap file generated by the Silverlight project is included as a dependency of our module.


Now our solution structure should look as shown below.


Step 9: Update the Elements.xml file of the module item: add a File element that specifies the XAP file path (source path) and Url (the relative URL after deployment).
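A minimal Elements.xml for the module might look like the following sketch; the module name matches the one used in Step 8, while the Path and Url values are assumptions for this sample and should match your project's output and the location you intend to deploy to (here, the root of the site collection, as used later in Step 13):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Deploys the Silverlight package to the root of the site collection -->
  <Module Name="FactoryMapModule">
    <File Path="FactoryMapModule\SLFactoryMap.xap" Url="SLFactoryMap.xap" />
  </Module>
</Elements>
```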


Step 10: At this point we are ready to build the project and deploy the SharePoint solution. Right-click the SharePoint project and select the "Deploy" option. This action builds the project, creates a .wsp solution file, deploys the solution to the SharePoint solution gallery in the specified site collection, and activates the feature. Now we can go to the solution gallery as the site collection administrator and view our solution in the solution store.

Go to Site Settings –> Galleries –> Solutions



Step 11: Create a custom list in the site with the name "FactoryLocations" with two columns named "Longitude" and "Latitude", and fill in some sample data. Our web part is going to consume this data when rendering the map.
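If you prefer to script this step, a rough server-side PowerShell sketch could look like the following. This is an assumption-laden example, not part of the original walkthrough: it assumes the SharePoint snap-in is loaded, that http://localhost is your site URL, and the sample coordinates are placeholders.

```powershell
# assumes: Add-PSSnapin Microsoft.SharePoint.PowerShell has been run
$web = Get-SPWeb http://localhost
# create the custom list the web part reads from
$listId = $web.Lists.Add("FactoryLocations", "Factory locations for the map web part",
                         [Microsoft.SharePoint.SPListTemplateType]::GenericList)
$list = $web.Lists[$listId]
# add the two numeric columns the Silverlight code casts to Double
$list.Fields.Add("Latitude", [Microsoft.SharePoint.SPFieldType]::Number, $false) | Out-Null
$list.Fields.Add("Longitude", [Microsoft.SharePoint.SPFieldType]::Number, $false) | Out-Null
# add one sample row (placeholder coordinates)
$item = $list.Items.Add()
$item["Title"] = "Sample site"
$item["Latitude"] = -37.81
$item["Longitude"] = 144.96
$item.Update()
$web.Dispose()
```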

Step 12: Add the Silverlight web part to a web part page. SharePoint 2010 provides a wrapper web part (under the Media and Content category) to host Silverlight .xap files. The concept is similar to the SmartPart approach we had in WSS3/MOSS 2007. We need to add this wrapper web part to the page where we want to place the Silverlight content, as shown below.


Step 13: Specify the relative path of the .xap file to the Silverlight web part. In this sample the .xap file is located in the root of the site collection.

Note: Also set the Chrome Type to None in the Silverlight web part properties. (This is a workaround for a bug present in Beta 2 in rendering the title of the Silverlight web part.)


Now we should be able to see the map control in our Silverlight web part, with the locations specified in our list marked as pushpins. Note that the sandbox worker process is where your code actually runs, in contrast to running inside w3wp.exe; this is why you don't have to restart the application pool every time you redeploy a sandboxed solution. If you wish to debug a sandboxed solution and F5 debugging is not working for any reason, you will need to attach the Visual Studio debugger to the SPUCWorkerProcess.exe process.

As we have seen, the power of Silverlight 3.0 along with the SharePoint 2010 client object model enhances developer productivity and opens up a lot of opportunities to build rich, interactive applications that leverage the SharePoint 2010 platform.

Setting up a SharePoint 2010 Beta 2 Development Environment

In this post I will go through my experience setting up SharePoint 2010 Beta 2 as a development environment. Here I am setting up a single-server farm with AD, SharePoint, and SQL Server sitting in a single development box. I am using Windows Server 2008 R2 (64-bit) as the OS, and the following are the high-level initial steps that I've taken to configure the base OS. The full list of hardware and software requirements can be found in Determine hardware and software requirements (SharePoint Server 2010), and Jeremy Thake has an excellent wiki page outlining the detailed steps on SharePointDevWiki.

1) Add the Active Directory Domain Services server role and configure the domain using the Active Directory Domain Services Installation Wizard. Also add the Web Server (IIS) role and the Application Server role.
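On Windows Server 2008 R2 the roles can also be added from PowerShell, as a sketch (dcpromo is still needed afterwards to actually create the domain):

```powershell
# import the Server Manager module that ships with Windows Server 2008 R2
Import-Module ServerManager
# add the AD DS, IIS, and Application Server roles
Add-WindowsFeature AD-Domain-Services, Web-Server, Application-Server
# then run dcpromo to configure the new domain
```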

2) Create Service Accounts

It is strongly recommended to create domain accounts and use them as service accounts. You need to create at least the following accounts in Active Directory. I will be using the local Administrator as the setup account for this dev environment, along with the following accounts.

Account type | Account name
Farm Account or DB Access Account | SpSqlService
Service Account | SpService

For a detailed description of the required accounts and permissions, refer to Administrative and service accounts required for initial deployment. Additionally, you should create a separate service account for every service in order to meet the least-privilege security best practice.
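As a sketch, the two accounts could be created with the Active Directory module for Windows PowerShell (available on Windows Server 2008 R2); this example is an assumption, not part of the original steps, and prompts for the passwords rather than hard-coding them:

```powershell
Import-Module ActiveDirectory
# farm / DB access account
New-ADUser -Name "SpSqlService" -SamAccountName "SpSqlService" `
    -AccountPassword (Read-Host -AsSecureString "Password for SpSqlService") `
    -Enabled $true -PasswordNeverExpires $true
# general service account
New-ADUser -Name "SpService" -SamAccountName "SpService" `
    -AccountPassword (Read-Host -AsSecureString "Password for SpService") `
    -Enabled $true -PasswordNeverExpires $true
```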

3) Disable the loopback check using this PowerShell script written by Michael Blumenthal. You might have to temporarily change the PowerShell execution policy from RemoteSigned (the default) to Unrestricted using "Set-ExecutionPolicy Unrestricted" before attempting to run the script.

Why do we need this?

Recently Microsoft released an update which prevents you from logging on locally to a website that uses an FQDN. The following KB article describes the issue: You receive error 401.1 when you browse a Web site that uses Integrated Authentication and is hosted on IIS 5.1 or a later version.

To resolve it, you can disable the loopback check completely, or just for the FQDNs in use. Alternatively, if you have more than one server in your farm, a better way is a custom stsadm extension; Gary Lapointe has written such an extension. Take a look at it: Setting Back Connection Host Names for SharePoint 2007 Using STSADM. I used the PowerShell script.
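For reference, the complete-disable variant boils down to a single registry value; this one-liner is my own sketch of that blunt approach (scoping BackConnectionHostNames to specific FQDNs, as the KB article describes, is the safer alternative):

```powershell
# disables the loopback check machine-wide; a reboot is typically recommended afterwards
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" `
    -Name "DisableLoopbackCheck" -Value 1 -PropertyType DWord
```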

4) Install SQL Server 2008 with Windows authentication and set all services to run under the dedicated SpSqlService account. Also install SQL Server 2008 SP1 and apply the following patch: KB976761

5) Download and Install Microsoft Chart Controls for Microsoft .NET Framework 3.5

6) Download and install  Microsoft "Geneva" Framework Runtime (2008R2\x64\AdfsSetup.exe)

7) Download and Install ADO.NET Data Services v1.5 CTP 2 

Note: If you are using Windows Server 2008, you will need to download and install the following additional components: Windows Server 2008 SP2, Windows Management Framework with PowerShell 2, and the WCF fix.

8) Run the SharePoint Server setup and select the "Install software prerequisites" option.


Once all software requirements are satisfied, you should see the following screen.



9) Now we are ready to install SharePoint Server 2010 and set up the farm environment. Run the setup and follow the wizard. When you come to the Installation Type step, select the "Server Farm" option.


10) Select the "Complete" option for the Server Type.



11) At the end of the installation, check the Run Configuration Wizard option and click the Close button.


12) This will open the SharePoint Products Configuration Wizard. When you come to "Connect to a server farm", select the "Create a new server farm" option.


13) Specify the database server name and the domain SQL service account that we created earlier.


14) Specify a passphrase


15) Accept the defaults for the Central Administration Web Application


16) Verify the Configuration details and proceed



Now we have finished running the SharePoint Products Configuration Wizard. This has created the core SharePoint databases, web applications, and the associated application pools and services.

The following application pools have been created at this point.

App pool | Identity | Description
SharePoint Central Administration v4 | SharePoint Farm Account | Responsible for Central Administration on the machines hosting it
SharePoint Topology Service Application | SharePoint Farm Account | The app pool name in IIS is a GUID; in SharePoint it is SharePoint Web Services System. This hosts the Topology Service Application, also known as the Application Discovery and Load Balancer Service Application
SharePoint Security Token Service Application | SharePoint Farm Account | The app pool name in IIS is a GUID; in SharePoint it is SecurityTokenServiceApplicationPool
SharePoint Web Services Root | LocalService | Status: Stopped. This hosts the SharePoint Web Services IIS web site and is on every machine in the farm. It is the host web site for Service Applications; Service Application WCF endpoints are hosted here


17) Launch the Farm Configuration Wizard from Central Administration. This will provision the Service Applications and allows you to create a top-level content web application and site collection.


Specify the service account for running farm services.


Create a root level Site Collection



After the Farm Configuration Wizard has finished, we will have the following additional application pools.

App pool | Identity | Description
SharePoint Web Services Default | SharePoint Services Account (SpService) | Hosts all the other Service Applications
SharePoint – 80 | SharePoint Services Account (SpService) | The default application pool used to host the end-user content web application


Now we have a working site collection for development purposes. Additionally, if you are going to develop sandboxed solutions, you will need to start the SharePoint Foundation User Code Service as described in Configure a farm for sandboxed solutions.
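As a sketch, that service can also be started from PowerShell (assuming the SharePoint snap-in is loaded; the TypeName wildcard match is an assumption based on the service's display name and is worth verifying with a plain Get-SPServiceInstance first):

```powershell
# start the sandboxed code host service on the local server
Get-SPServiceInstance |
    Where-Object { $_.TypeName -like "*Sandboxed Code*" -or $_.TypeName -like "*User Code*" } |
    Start-SPServiceInstance
```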

Hope this helps!