A Lap Around Business Connectivity Services in SharePoint 2010

Here are my slide deck and the resources from SharePoint Saturday Sydney 2011. Thanks to those who attended and voted for this session.

BCS Overview
Plan to upgrade BCS
BCS Resource Centre
BCS Video – Secure Store
BCS Security Overview
How BCS & Azure play together 

I am Speaking at SharePoint Saturday Sydney on 06/08/2011


I am pleased to announce that I will be presenting at the next SharePoint Saturday Sydney on 06/08/2011. I will be presenting on Business Connectivity Services (BCS).

BCS is an umbrella term for a set of technologies and tools that enable both developers and power users to build business applications that fit natively into SharePoint 2010.

Here is an abstract of my session:

Business Connectivity Services (BCS) in SharePoint 2010 provides new ways to connect and integrate with external data such as your own SQL databases. BCS enhances the SharePoint platform’s capabilities with out-of-the-box features in the key areas of presentation, connectivity, tooling, and lifecycle management that streamline the development of solutions with deep integration of external data and services. Come learn what’s available out of the box and how to bring your .NET skills to seamlessly integrate and surface external line-of-business data through SharePoint 2010.


Check out the strong session line-up here: http://bit.ly/19Feqc. See you there!


Lotus Notes to SharePoint 2010 Migration

Recently I was engaged in a large-scale Lotus Notes to SharePoint 2010 migration project for a client in the travel/aviation industry vertical. Lotus Notes is a multi-user, collaborative client-server application environment competing with the Microsoft SharePoint 2010 suite of products, Microsoft Outlook, and SharePoint Workspace 2010. There are numerous legacy Lotus Notes implementations, which can become an overhead to maintain and scale over time. This post summarises the approach taken to migrate such applications to SharePoint 2010.

The legacy Notes applications in question had been designed and developed a few years back and were not helping to accelerate business outcomes for the client. Migrating those applications onto a single platform that could be centrally managed and maintained over time was therefore timely.

Notes Architecture

The Domino product is an integrated messaging and web application software platform. It provides a robust environment for creating web applications that support workflow, RDBMS connectivity, and collaboration. Domino is ideal for creating applications that allow multiple, "sometimes connected" users to cooperatively process unstructured information. The term Lotus Notes is now used to refer to the underlying client-server architecture that the product is built upon.

Storage Model

The unit of storage in Lotus Notes is a "Note" (these notes are also referred to as documents, a term more familiar to users). Each note contains "items", and each item has its own item name (an item may be a list of values). This recursive structure of one element containing other elements has been referred to as the Notes container model. The storage engine is the "Notes Storage Facility", and the file it creates on the Notes client is given the extension .NSF. The Notes data model contains no concept of a "record type"; all documents in a database are equivalent as far as Notes is concerned.
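To make the container model concrete, here is a minimal sketch that walks a database, its documents, and their items from C#. It assumes a Notes client install with the "Lotus Domino Objects" COM library registered (interop namespace Domino, as I recall the API); the server, file, view, and item names are placeholders.

```csharp
using System;
using Domino; // COM interop for "Lotus Domino Objects" (requires a Notes client install)

// Sketch of the container model: a database (.nsf) contains notes/documents,
// each document contains named items, and an item's value is a list of values.
class NotesContainerModelDemo
{
    static void Main()
    {
        NotesSession session = new NotesSession();
        session.Initialize("notes-id-password");              // placeholder password

        NotesDatabase db = session.GetDatabase("Srv/Org", "apps\\orders.nsf", false);
        NotesView view = db.GetView("AllDocuments");          // placeholder view name

        NotesDocument doc = view.GetFirstDocument();
        while (doc != null)
        {
            // GetItemValue always returns an array - an item is a list of values.
            object[] values = (object[])doc.GetItemValue("CustomerName");
            Console.WriteLine("CustomerName: {0}",
                string.Join(", ", Array.ConvertAll(values, v => v.ToString())));
            doc = view.GetNextDocument(doc);
        }
    }
}
```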

User Interface

Forms are the mechanism that the Notes client (desktop client) uses to show the user what is in a document. The areas on a form that display the values stored in a document's items are called fields. A listing of item values from a set of documents in a column format is a Notes view.

Agents

You can program Notes to perform tasks automatically using agents (also known as macros). Agents can help you perform repetitive tasks, such as managing documents and sending memos. Agents can complete almost any action you can do manually in your databases. When you create an agent, you can set options to specify:

  • When the agent should run (e.g. when a document has been changed)
  • Which documents in the database the agent should run on. You can target all or any subset of documents in the database for the agent to run on.
  • What actions the agent should complete. You can choose actions from a list or use Notes formulas or programming scripts and languages to create actions.

Tools for the Job

  • Visio 2010 Premium – Modelling the new process workflows with business analysts/domain experts
  • SharePoint Designer 2010 – Authoring the workflows/ customising list forms etc.
  • Visual Studio 2010 – Creating solution artefacts, custom event handlers, and features, and packaging sandboxed .wsp solutions
  • SPServices jQuery library for SharePoint web services – UI extensibility helpers / Cascaded dropdowns etc.
  • Bitbucket with Mercurial/VisualHG – Distributed version control (DVCS)
  • Quest Notes Migrator for SharePoint – For Data Mapping and Migration from Notes fields to SharePoint Lists.

Migration Map

Each Notes application was mapped to a corresponding SharePoint 2010 custom application in its own site collection, with its own logical security boundary.

[Diagram: Notes applications mapped to SharePoint 2010 site collections]

Process

The new applications were architected and implemented to leverage the new platform capabilities of SharePoint 2010, and the required data was mapped into the new application platform. The Quest Notes Migrator did a good job for us, nicely abstracting the Lotus Notes internals and allowing us to define Extract-Transform-Load (ETL) operations, including user mappings from Domino to AD accounts. Where transformations were complex, an Extract-Load-Transform (ELT) approach was used, with custom transformation rules written in PowerShell/C#.
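As an illustration of such a rule (a hypothetical helper of my own, not part of the Quest tooling; the naming conventions here are assumptions), a Domino-to-AD account mapping might look like this:

```csharp
using System;

// Hypothetical ELT transformation rule: maps a Domino canonical name such as
// "CN=Jane Citizen/OU=Sales/O=Contoso" to an AD login like "CONTOSO\jane.citizen".
public static class NotesUserMapper
{
    public static string ToAdAccount(string dominoCanonicalName, string domain)
    {
        if (string.IsNullOrEmpty(dominoCanonicalName))
            throw new ArgumentException("A Domino name is required.", "dominoCanonicalName");

        // Take the common-name component ("CN=Jane Citizen") from the
        // slash-separated canonical name.
        string cnPart = dominoCanonicalName.Split('/')[0];
        string commonName = cnPart.StartsWith("CN=", StringComparison.OrdinalIgnoreCase)
            ? cnPart.Substring(3)
            : cnPart;

        // Assume the AD convention is firstname.lastname in lower case.
        string account = commonName.Trim().Replace(' ', '.').ToLowerInvariant();
        return domain + "\\" + account;
    }
}
```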

The high-level process followed for migrating each Notes application to the SharePoint 2010 platform involved the following steps:

  1. Consulted Notes application owners (product owners) in each department and created a prioritised product backlog
  2. Modelled the new workflows/rules in Visio 2010 Premium with business users
  3. Built and prototyped the new application with the best-fit tool for each job (Visual Studio 2010 – authoring and deploying base-level custom site columns/content types/list definitions/list instances/custom event handlers; SharePoint Designer 2010 – forward-engineering the Visio workflows and XSLT list form customisation)
  4. Created and tested data mapping jobs with Quest Notes Migrator for SharePoint in the dev environment with a subset of production data
  5. Repeated steps 3 and 4 with continuous product owner reviews/feedback
  6. Reverse-engineered reusable workflows and list forms into solution artefacts, packaged them into SharePoint solution packages, and deployed them to the production environment as SharePoint features
  7. Executed the data migration jobs on the production environment

I hope this post benefits those who are interested in migrating legacy Lotus Notes applications to SharePoint 2010.


I am presenting at SharePoint Saturday Perth



I am presenting on the topic “Unboxing Claims based authentication in SharePoint 2010” at SharePoint Saturday Perth on 9th April.

If you are curious about what’s new in the SharePoint 2010 security model, I reckon this will answer some of your questions. We will explore how SharePoint 2010 has undergone a shift in identity and access control by adopting the claims-based identity model offered by Windows Identity Foundation (WIF). You will see how SharePoint 2010 implements the extensibility points of WIF and achieves a standards-based identity solution designed for heterogeneous identity environments, for both sign-in and services (SOA), highlighting the benefits, implications, and implementation details that can help drive greater interoperability between SharePoint and other systems.

Check out the other great sessions here:

http://www.sharepointsaturday.org/perth/Pages/meetings.aspx

See you there!


Improving the quality of your custom SharePoint 2010 solutions via Continuous Integration (CI) with Team Foundation Server 2010

This is a follow-up blog post based on the content of my presentation on “SharePoint 2010 ALM”, given at the Melbourne SharePoint User Group meeting (#MOSSIG) on 23rd March 2011. Since then I’ve received a number of requests from the community to share the content and references, so I thought I’d write a more detailed post about the topic. Here we go.

Continuous integration is a critical process in modern-day product development: it improves the quality of your solution from the first sprint (iteration) if you are following an agile approach like Scrum. It allows you to detect and rectify issues in your solution early, which increases the chances of success without painful integration issues and last-minute surprises in your product development cycle.

How does it improve the quality of my custom SharePoint solution?

Well, if you have a mature, fully automated continuous build (CI build) and a nightly build, you are on the right path, and your SharePoint development team enjoys the following in every sprint:

  1. It forces you to promote code from Dev -> Test -> Staging -> Production in the correct way in the first place; otherwise automation will not be possible.
  2. Every code check-in by a developer kicks off automated code validation, runs unit tests (if any), and builds the SharePoint solution (.wsp files) on the Team Foundation Server.
  3. A nightly build does everything in step 2, plus adds, deploys, and activates your SharePoint solution packages in your test farm, and can run automated coded UI tests and automated load tests (if any) for you.

Here is a sketch of the workflow:

[Diagram: CI build workflow]

Why was this approach not so common in SharePoint Development?

The lack of SharePoint development tooling support in VS2008 contributed significantly, coupled with the natural complexity that arises when you develop on a platform rather than a development framework. When it comes to SharePoint development, you do not necessarily develop in the Visual Studio environment: you can build quite powerful collaborative solutions and branding work with SharePoint Designer as no-code solutions. But unless those customisations are converted into solution artefacts, they cannot be cleanly source-controlled and maintained, which breaks the fundamental concept of Application Lifecycle Management (ALM).

The Challenge

SharePoint 2010 has evolved and positioned itself as a single unified platform for most organisations, playing a mission-critical role in day-to-day operations. With the introduction of Business Connectivity Services (BCS), the client object model, sandboxed solutions, WCF-based service applications, and claims-based authentication, SharePoint 2010 has also evolved significantly as an application development platform, opening up a number of extensibility options for developers.

We also have an integrated set of tools built into Visual Studio 2010 for SharePoint development. So the challenge is: how do we use these tools effectively and efficiently in a team-based environment to build a product or solution and deploy it into those shared, mission-critical environments confidently and with consistent quality? This post describes how to implement an automated continuous build, with the help of the following tools, to validate, build, and deploy SharePoint solutions.

Step 1: Get all your customisations into solution artefacts. Reverse-engineer your SharePoint Designer customisations and featurise them; the CKSDev Branding SPI can help you deploy master pages and CSS as features. Save all SharePoint Designer-authored workflows as reusable workflows (.wsp) from SharePoint Designer and bring them into the Visual Studio environment.

[Screenshot]

Step 2: Create a new team project and add your solution to source control under TFS. For the purposes of this post I am using a simple VS2010 solution named SPHelloWorld. For code analysis, the SPDisposeCheck ruleset needs to be in source control, where it can be shared across multiple projects within your repository.

[Screenshot]

Step 3: Enable code analysis for your solution via the project properties and select the SPDisposeCheck ruleset from your local TFS workspace as shown below.

[Screenshot: Code Analysis settings with the SPDisposeCheck ruleset]

Step 4: Note that code analysis now runs the SPDisposeCheck rules as part of compilation, and both the “Dispose” and “Do not Dispose” rules fire when poor code is detected.

[Screenshot: SPDisposeCheck rule violations reported at compile time]
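If you don’t have the screenshot to hand, here is a minimal illustration (my own snippet, not the SPHelloWorld code in the screenshots) of the kind of code that trips both rule families:

```csharp
using Microsoft.SharePoint;

public static class BadDisposePatterns
{
    public static string GetTitle(string siteUrl)
    {
        // Leaked objects: this SPSite and SPWeb are never disposed,
        // which trips SPDisposeCheck's "should dispose" rules.
        SPSite site = new SPSite(siteUrl);
        SPWeb web = site.OpenWeb();

        // Disposing SPContext.Current.Web is wrong - SharePoint owns that
        // object - and trips the "do not dispose" rules.
        SPContext.Current.Web.Dispose();

        return web.Title;
    }
}
```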


Step 5: Now open the default build process template for the project. We are going to extend the default build definition file to deploy our solution to the test SharePoint farm remotely. TFS 2010 uses a Windows Workflow Foundation 4-based .xaml template. Drag the activities from the toolbox on the left and assemble them as shown below, just after packaging has finished.

The high-level tasks performed by this segment of the workflow are:

  1. Build the solution
  2. Run code analysis with SPDisposeCheck on the TFS server. If code analysis fails, the build fails at this point.
  3. Create the .wsp solution packages
  4. Check the user-specified deployment flag (a workflow initiation parameter)
  5. Write a log message to the log file
  6. Map an admin share drive to the target SharePoint server (test farm) with the user-specified credentials
  7. Set the STSADM path on the target server
  8. Stop the V4 Admin service in the target farm (the template uses PSEXEC to remotely execute STSADM on the target server)
  9. Copy each .wsp file to the target SharePoint server, then add and deploy the solution and execute the timer job in the user-specified sequence (sketched in code after this list)
  10. Finally, turn the V4 Admin timer service back on
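To make steps 8 and 9 concrete, here is a sketch of the kind of call each InvokeProcess activity issues; the server name, credentials, and paths are placeholders that the build template supplies as workflow arguments:

```csharp
using System.Diagnostics;

// Hypothetical sketch of the remote STSADM invocation behind the
// "Add solution" InvokeProcess activity: psexec runs stsadm on the
// target SharePoint server with user-specified credentials.
static class RemoteDeployer
{
    const string StsadmPath =
        @"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN\stsadm.exe";

    public static void AddSolution(string server, string user, string password, string wspPath)
    {
        ProcessStartInfo psi = new ProcessStartInfo
        {
            FileName = "psexec.exe",
            Arguments = string.Format(
                "\\\\{0} -u {1} -p {2} \"{3}\" -o addsolution -filename \"{4}\"",
                server, user, password, StsadmPath, wspPath),
            UseShellExecute = false
        };
        using (Process p = Process.Start(psi))
        {
            p.WaitForExit(); // a non-zero exit code should fail the build
        }
    }
}
```

The deploysolution and execadmsvcjobs STSADM operations follow the same pattern.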


If you are interested, the customised TFS 2010 workflow template for building and deploying .wsp solutions can be downloaded from my SkyDrive here.

Now define the additional workflow initiation parameters that need to be supplied by the user when starting a new team build.

[Screenshot: workflow initiation parameters]

Now set the properties for each activity. As a sample, I’ve shown below the properties set for the “Add solution” InvokeProcess workflow activity. Once all properties are set, WF4 validates your workflow; once all validations pass, check in your custom build definition template to TFS.

[Screenshot: properties of the “Add solution” InvokeProcess activity]


Step 6: Author a new build definition based on our updated workflow template.

[Screenshot: creating the build definition]

Some important parameters that you need to set are shown below.

[Screenshots: build definition parameters]

Step 7: It’s time to queue a new build. Now assume a scenario where a developer rushes to check in, overriding the check-in policies set by the build master.

[Screenshot: developer overriding the check-in policy]

And here is the result: the TFS build agent fails the build because the code did not pass code analysis. No .wsp files are produced.

[Screenshot: failed build report]

So the developer now fixes the code with proper disposal, as shown below, and checks in the code again.


[Screenshot: corrected code with proper disposal]
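The essence of the fix (a reconstruction of the pattern, not the exact code in the screenshot) is to wrap the objects the code creates in using blocks and to leave SharePoint-owned objects alone:

```csharp
using Microsoft.SharePoint;

public static class FixedDisposePatterns
{
    public static string GetTitle(string siteUrl)
    {
        // Objects we create are wrapped in using blocks so they are always
        // disposed; SPContext.Current.Web is not touched because SharePoint
        // owns its lifetime.
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            return web.Title;
        }
    }
}
```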

And after a few seconds we get this TFS build status. All green!

[Screenshot: successful build status]

Opening the build log gives us the workflow execution log as shown below. Note how STSADM has been invoked remotely via psexec.exe on the TFS build agent.


[Screenshot: build log showing STSADM invoked remotely via psexec.exe]

And see below the end result in Central Administration in our test SharePoint farm.

[Screenshot: solution deployed, as seen in Central Administration]

On a final note, “the right tools make all the difference”. But tooling by itself may not help you build a quality product; the process matters. TFS 2010 supports your chosen process and allows you to tailor the default process templates if required. You have the chance to select a process template when you create a new team project, as shown below. I highly encourage you to check out the new “Microsoft Visual Studio Scrum 1.0” template, co-developed by Microsoft in collaboration with the Scrum community. The template guides you through Scrum-based agile product development; I’ve highlighted a few features below.

  • Creating effective user stories
  • Creating, prioritising, and grooming the product backlog
  • Agile estimation and sprint planning
  • Scrum artefacts such as sprint burndown and release burndown charts

[Screenshot: process template selection]

And here is the high-level process behind the template.

[Diagram: the process behind the Scrum 1.0 template]


The correct tools coupled with the correct process will take your product/solution to the next level faster and more smoothly, making you a successful SharePoint developer and, in turn, a successful product/solution delivery team.

Where to go from here?

Enjoy!


Consuming Windows Azure DataMarket with SharePoint 2010 via Business Connectivity Services

Windows Azure Marketplace DataMarket is a service that provides a single, consistent marketplace and delivery channel for high-quality information as cloud services. It provides high-quality statistical and research data across many business domains, from finance to healthcare. You can read more about it here.

How do we consume this content in an enterprise scenario? In this post we are going to look at how to integrate Azure Marketplace DataMarket with SharePoint 2010 via Business Connectivity Services, using a .NET connector, as an external list. This enables us to surface Azure DataMarket data through SharePoint 2010 lists, which opens up a world of possibilities, including taking data offline with SharePoint Workspace and participation in workflows, enterprise search, and so on.

Let’s first start with Azure DataMarket. Log on to https://datamarket.azure.com with your Live ID and subscribe to a dataset that you are interested in. I selected the European Greenhouse Gas Emissions dataset. Also note down your account key from the Account Keys tab, as we will need it later on.

[Screenshot: subscribing to the dataset in Azure DataMarket]

The Dataset details give you the input parameters and results as follows.

[Screenshot: dataset input parameters and results]

You can navigate to the dataset link in your browser and view the OData feed. We are going to consume this OData feed with WCF Data Services, which allows us to use LINQ to filter the data.

LINQPad is a nifty tool for developing and testing LINQ query expressions. It allows you to connect to Windows Azure DataMarket datasets and develop queries against the data as shown below.

[Screenshot: LINQPad querying the DataMarket dataset]

Now let’s move on to VS2010 and create a BDC Model project as shown below.


[Screenshot: creating a BDC Model project in VS2010]

Rename the default entity in the BDC Model designer and set the properties that you want to expose in the external content type.

[Screenshot: entity properties in the BDC Model designer]

Once all properties are set, BDC Explorer displays my model as shown below.

[Screenshot: the model in BDC Explorer]

Our BDC model is now ready. Next, let’s implement the ReadList and ReadItem BDC operations.

First, add a service reference to your project. Set the address to the location of your Azure DataMarket dataset and discover the OData feed as shown below.

[Screenshot: discovering the OData feed in Add Service Reference]

After the service reference is added, we can implement the ReadItem method as shown below.


[Screenshot: ReadItem implementation]
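For readers following along without the screenshots, here is a minimal sketch of what the two operations can look like. The context type (EmissionStatsContainer), entity type (EmissionStat), and property names are assumptions standing in for the types the service reference generates, so substitute the names generated for your dataset:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;

// Minimal sketch of the BDC operations; "EmissionStatsContainer" and
// "EmissionStat" stand in for the generated service reference types.
public class EmissionStatService
{
    private const string ServiceRootUri = "https://api.datamarket.azure.com/..."; // your dataset root
    private const string LiveId = "<your Live ID>";
    private const string AccountKey = "<your DataMarket account key>";

    private static EmissionStatsContainer CreateContext()
    {
        EmissionStatsContainer context = new EmissionStatsContainer(new Uri(ServiceRootUri));
        // The DataMarket feed is authenticated with your Live ID and account key.
        context.Credentials = new NetworkCredential(LiveId, AccountKey);
        return context;
    }

    // ReadList: returns a bounded set of entities for the external list.
    public static IEnumerable<EmissionStat> ReadList()
    {
        return CreateContext().EmissionStats.Take(100).ToList();
    }

    // ReadItem: returns the single entity matching the identifier.
    public static EmissionStat ReadItem(string id)
    {
        return CreateContext().EmissionStats
            .Where(e => e.Id == id)
            .Take(1)
            .ToList()
            .FirstOrDefault();
    }
}
```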

Note that the service reference context requires your Live ID and account key to be passed as credentials in the service call.

[Screenshot: credentials passed to the service context]

Once all operations are implemented, press F5 to build and deploy your BDC model into SharePoint. Once deployed, log on to Central Administration, go to Manage Service Applications, select the Business Data Connectivity service, and view the details of the EmissionStat content type that has been deployed. You might want to set permissions for this content type at this point.

[Screenshot: EmissionStat external content type in Central Administration]

Now that the external content type is defined, all we need is an external list based on it. Fire up SharePoint Designer and create an external list referencing the external content type just created.

[Screenshot: creating the external list in SharePoint Designer]

Once the list is created, navigate to it in the portal. Note that Windows Azure Marketplace DataMarket content is now surfaced through a SharePoint 2010 list, as shown below. The VS2010 solution associated with this post can be downloaded here.

[Screenshot: DataMarket data surfaced in a SharePoint list]

Viewing an item shows its full details using our ReadItem BCS operation.

[Screenshot: item details view]

Integrating structured data (Azure) with unstructured content (collaborative data) in SharePoint allows you to create high-business-value composites for informed decision-making at all levels of your organisation. Happy cloud collaboration!


Leverage and Extend Claims based identity in SharePoint 2010

Windows Identity Foundation (WIF) is the platform on which SharePoint 2010 claims authentication is based. WIF, which is fully supported in SharePoint 2010, ADFS 2.0, ASP.NET, Windows Communication Foundation (WCF), and any other .NET application you care to develop, provides the infrastructure necessary to generate, transmit, and process claims-based identity in a simple and straightforward manner. It removes the roadblocks imposed by legacy authentication schemes like NTLM and Kerberos and puts control directly into the hands of developers, users, and IT security professionals. Long story short, it’s a framework written to help solve the identity issues common in the era of cloud computing and service-oriented architecture.

The idea of claims-based identity is one that many people are willing to try. Getting accurate information out to the public, though, does take time.

The important point is that this is based on industry standards. Many different vendors are on board along with Microsoft. The digital world continues to give us new opportunities, and those involved believe this will help all of us get the most out of it. There is a strong foundation in place to continue building upon. AD FS v2, CardSpace, and Windows Identity Foundation are all important pieces of this puzzle.

As a demonstration of these capabilities, I’ll show how SharePoint 2010, WCF, and WIF can be put together to solve the identity delegation problem. In part 1 of this demo session, I start by establishing the trust relationship between ADFS 2.0 and SharePoint with PowerShell and demonstrate how the claims get into SharePoint. Then we build and deploy a claims viewer web part using the WIF programming model. In part 2, we start with a web service that front-ends line-of-business information stored in a SQL database. Then we configure it to use WIF to request the calling user’s claims from SharePoint and process the token so that authorization decisions can be made. Finally, we surface this information in SharePoint 2010 as an external content type using Business Connectivity Services (BCS).
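To give a taste of the WIF programming model used in the claims viewer web part, here is a minimal sketch of my own (the web part in the download differs); it enumerates the claims WIF attaches to the current identity and renders them in a table:

```csharp
using System.Threading;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.IdentityModel.Claims; // WIF (Microsoft.IdentityModel.dll)

// Minimal claims viewer web part: lists claim type, value, and issuer
// for the current claims-based identity.
public class ClaimsViewerWebPart : WebPart
{
    protected override void CreateChildControls()
    {
        IClaimsIdentity identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
        if (identity == null)
        {
            Controls.Add(new Literal { Text = "No claims-based identity found." });
            return;
        }

        Table table = new Table();
        foreach (Claim claim in identity.Claims)
        {
            TableRow row = new TableRow();
            row.Cells.Add(new TableCell { Text = claim.ClaimType });
            row.Cells.Add(new TableCell { Text = claim.Value });
            row.Cells.Add(new TableCell { Text = claim.Issuer });
            table.Rows.Add(row);
        }
        Controls.Add(table);
    }
}
```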

This session was presented at SharePoint Saturday Melbourne in October 2010 and at the MOSSIG user group in November 2010.


Click here to view a recorded screencast of this session. 

The slide deck of this session is shared here

The SharePoint Claims Viewer visual web part VS2010 solution, the claims-based WCF service VS2010 solution, and the necessary PowerShell scripts to add and remove an identity provider in SharePoint 2010 can be downloaded from here.
 
I encourage you to look at the following links for further information about this topic.