The WordPress.com stats helper monkeys prepared a 2011 annual report for this blog.
Here’s an excerpt:
The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 10,000 times in 2011. If it were a concert at the Sydney Opera House, it would take about 4 sold-out performances for that many people to see it.
Here is my slide deck and the resources from SharePoint Saturday Sydney 2011. Thank you to those who attended and voted for this session.
- Plan to upgrade BCS
- BCS Resource Centre
- BCS Video – Secure Store
- BCS Security Overview
- How BCS & Azure play together
Recently I was engaged in a large-scale Lotus Notes to SharePoint 2010 migration project for a client in the travel/aviation industry. Lotus Notes is a multi-user, collaborative client-server application environment competing with the Microsoft SharePoint 2010 suite of products, Microsoft Outlook and SharePoint Workspace 2010. There are numerous legacy Lotus Notes implementations which can become an overhead to maintain and scale over time. This post summarizes the approach taken to migrate such an application to SharePoint 2010.
The legacy Notes applications in question were designed and developed a few years back and were no longer helping to accelerate business outcomes for the client. Migrating those applications onto a single platform that could be centrally managed and maintained over time was therefore timely.
The Domino product is an integrated messaging and Web application software platform. It provides a robust environment for creating Web applications that support workflow, RDBMS connectivity, and collaboration. Domino is ideal for creating applications that allow "sometimes connected", multiple users to cooperatively process unstructured information. The term Lotus Notes is now used to refer to the underlying client-server architecture that the product is built upon.
The unit of storage in Lotus Notes is a "Note" (notes are also referred to as documents, a term more familiar to users). Each Note contains "items", and each item has its own item name (an item may also hold a list of values). This iterative structure of one element containing other elements is referred to as the Notes container model. The storage file, the "Notes Storage Facility", is created on the Notes client with the extension .NSF. The Notes data model has no concept of a "record type": all documents in a database are equivalent as far as Notes is concerned.
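To make the container model concrete, a Note can be pictured as a bag of named items, where an item may itself hold a list of values. A minimal C# sketch (all names below are invented for illustration):

```csharp
using System.Collections.Generic;

// Sketch of the Notes container model: a document (a "Note") is a set of
// named items, and an item may be a single value or a list of values.
var note = new Dictionary<string, object>
{
    ["Form"]       = "TravelRequest",               // the form used to render it
    ["Requestor"]  = "Jane Citizen",                // single-value item
    ["Categories"] = new[] { "Air", "Domestic" }    // multi-value item
};
// There is no record type: any document may carry any set of items.
```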
Forms are the mechanism the Notes client (desktop client) uses to show the user what is in a document. The areas on a form that display the values stored in a document's items are called fields. A listing of item values from a set of documents in column format is a Notes view.
You can program Notes to perform tasks automatically using agents (also known as macros). Agents can help you perform repetitive tasks, such as managing documents and sending memos. Agents can complete almost any action you can perform manually in your databases. When you create an agent, you can set options to specify:
- When the agent should run (e.g. when a document has been changed)
- Which documents in the database the agent should run on. You can target all or any subset of documents in the database for the agent to run on.
- What actions the agent should complete. You can choose actions from a list or use Notes formulas or programming scripts and languages to create actions.
Tools for the Job
- Visio 2010 Premium – Modelling the new process workflows with business analysts/domain experts
- SharePoint Designer 2010 – Authoring the workflows/ customising list forms etc.
- Visual Studio 2010 – Creating solution artefacts, custom event handlers and features; packaging sandboxed .wsp solutions
- SPServices jQuery library for SharePoint web services – UI extensibility helpers / Cascaded dropdowns etc.
- Bitbucket with Mercurial/VisualHG – distributed version control (DVCS)
- Quest Notes Migrator for SharePoint – For Data Mapping and Migration from Notes fields to SharePoint Lists.
Each Notes application was mapped to a corresponding SharePoint 2010 custom application in its own site collection, with its own logical security boundary.
The new applications were architected and implemented to leverage the new platform capabilities of SharePoint 2010, and the required data was mapped into the new platform. Quest Notes Migrator did a good job for us, nicely abstracting the Lotus Notes internals and allowing us to define Extract-Transform-Load (ETL) operations, including user mappings from Domino to AD accounts. Where the transformations were complex, an Extract-Load-Transform (ELT) approach was used instead, with custom transformation rules written in PowerShell/C#.
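As an illustration of such a post-load (ELT) rule, a PowerShell sketch might look like the following; the site URL, list name and field names are hypothetical:

```powershell
# Sketch: tidy up a multi-value Notes item after it has been loaded into a
# SharePoint 2010 list (run in the SharePoint 2010 Management Shell).
$web  = Get-SPWeb "http://intranet/sites/travelapp"
$list = $web.Lists["Bookings"]
foreach ($item in $list.Items) {
    # Notes stored categories as a single semicolon-delimited string;
    # split, trim and write the cleaned value back.
    $raw = [string]$item["NotesCategories"]
    $item["Categories"] = ($raw -split ';' | ForEach-Object { $_.Trim() }) -join '; '
    $item.SystemUpdate($false)   # avoid bumping version/modified metadata
}
$web.Dispose()
```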
The high-level process followed for migrating each Notes application to the SharePoint 2010 platform involved the following steps:
- Consult Notes application owners (product owners) in each department and create a prioritised product backlog
- Model the new workflows/rules with Visio 2010 Premium together with business users
- Build and prototype the new application with the best tool for the job (Visual Studio 2010 to author and deploy base-level custom site columns, content types, list definitions, list instances and custom event handlers; SharePoint Designer 2010 to forward-engineer the Visio workflows and customise XSLT list forms)
- Create and test the data-mapping jobs with Quest Notes Migrator for SharePoint in the dev environment against a subset of production data
- Repeat steps 3 & 4 with continuous product-owner reviews/feedback
- Reverse-engineer reusable workflows and list forms into solution artefacts, package them into SharePoint solution packages and deploy them to the production environment as SharePoint features
- Execute the data migration job in the production environment
I hope this post benefits those who are interested in migrating legacy Lotus Notes applications to SharePoint 2010.
Improving the quality of your custom SharePoint 2010 solutions via continuous integration (CI) with Team Foundation Server 2010
Posted: March 31, 2011
This is a follow-up blog post based on the content of my presentation on “SharePoint 2010 ALM” at the Melbourne SharePoint User Group meeting (#MOSSIG) on 23rd March 2011. Since then I’ve received a number of requests from the community to share the content and references, so I thought I would write a slightly more detailed post on the topic. Here we go.
Continuous integration is a critical process in modern product development, improving the quality of your solution from the very first sprint (iteration) if you are following an agile approach like Scrum. It allows you to detect and rectify issues in your solution early, which increases your chances of success and avoids painful integration issues and last-minute surprises at the end of your development cycle.
How does it improve the quality of my custom SharePoint solution?
Well, if you have a mature, fully automated continuous build (CI build) and a nightly build, you are on the right path, and your SharePoint development team enjoys the following in every sprint.
- It forces you to promote code from Dev -> Test -> Staging -> Production correctly in the first place; otherwise automation will not be possible.
- Every code check-in by a developer will kick off automated code validation, run unit tests (if any) and build the SharePoint solution (.wsp files) on the Team Foundation Server.
- A nightly build will do everything in step 2, plus add, deploy and activate your SharePoint solution packages in your test farm, and can run automated coded UI tests and automated load tests (if any) for you.
Here is a sketch of the workflow.
Why was this approach not so common in SharePoint Development?
The lack of SharePoint development tooling support in VS2008 contributed significantly, coupled with the natural complexity that arises when you develop on a platform rather than a development framework. When it comes to SharePoint, you do not necessarily develop in the Visual Studio environment: you can build quite powerful collaborative solutions and branding work in SharePoint Designer as no-code solutions. But unless we convert those customisations into solution artefacts, we cannot cleanly source-control and maintain them, which breaks the fundamental concept of Application Lifecycle Management (ALM).
SharePoint 2010 has evolved and is positioned as a single unified platform for most organisations, playing a mission-critical role in day-to-day operations. With the introduction of Business Connectivity Services (BCS), the Client OM, sandboxed solutions, WCF-based service applications and claims-based authentication, SharePoint 2010 has also significantly evolved as an application development platform, which opens up a number of extensibility options for developers.
We also have an integrated set of tools built into Visual Studio 2010 for SharePoint development. So the challenge is: how do we use these tools effectively and efficiently in a team-based environment to build a product or solution and deploy it into those shared, mission-critical environments confidently and with consistent quality? This post describes how to implement an automated continuous build with the help of the following tools to validate, build and deploy SharePoint solutions.
- VS2010 (SP1 adds support for 64-bit unit testing and IntelliTrace with historical debugging capabilities for SharePoint development)
- VS2010 SharePoint Power tools (Sandbox code validation, Visual Web Part as Sandbox Solutions)
- TFS 2010
- CKSDev (productivity tools)
- Pex & Moles (unit testing)
- SPDisposeCheck (build validation)
- Sysinternals PSEXEC (remote deployment)
Step1: Get all your customisations into solution artefacts. Reverse-engineer your SharePoint Designer customisations and featurize them. The CKSDev Branding SPI can help you deploy master pages and CSS as features. Save all SharePoint Designer-authored workflows as reusable workflows (.wsp) from SharePoint Designer and bring them into the Visual Studio environment.
Step2: Create a new Team project and add your solution to source control under TFS. For the purposes of this post I am using a simple VS2010 solution named SPHelloWorld. For code analysis, the SPDisposeCheck ruleset needs to be in source control so it can be shared across multiple projects within your repository.
Step3: Enable Code Analysis for your solution via the project properties and select the SPDisposeCheck ruleset from your local TFS workspace as shown below.
Step4: Note that Code Analysis now runs the SPDisposeCheck rules as part of compilation, and both the Dispose and Do Not Dispose rules fire when poor code is detected.
Step5: Now open the default build template for the project. We are going to extend the default build definition file to deploy our solution to the test SharePoint farm remotely. TFS 2010 uses a Windows Workflow Foundation 4 based .xaml template. Drag the activities from the toolbox in the left sidebar and assemble them as shown below, just after packaging has finished.
The high-level tasks performed by this segment of the workflow are:
- Build the Solution
- Run code analysis with SPDisposeCheck on the TFS server. If the code analysis fails, our build fails at this point.
- Create .wsp solution package
- Check the user-specified deployment flag (this is a workflow initiation parameter)
- Write a log message to the log file
- Map an admin share drive to the target SharePoint server (test farm) with the user-specified credentials.
- SET the STSADM path on the target server
- Stop V4 Admin service in the target farm. (This uses PSEXEC to remotely execute STSADM on the target server)
- Copy each .wsp file to the target SharePoint server and add, deploy and execute the timer job in user specified sequence.
- Finally turn on the V4 Admin timer service.
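For reference, the InvokeProcess activities in that segment effectively run commands like these against the target server (the server name, paths and solution name below are placeholders):

```
rem Stop the SharePoint 2010 Administration service on the target farm
psexec \\SPTEST01 net stop SPAdminV4

rem Add and deploy the solution package copied to the target server
set STSADM="%CommonProgramFiles%\Microsoft Shared\Web Server Extensions\14\BIN\stsadm.exe"
psexec \\SPTEST01 %STSADM% -o addsolution -filename c:\deploy\SPHelloWorld.wsp
psexec \\SPTEST01 %STSADM% -o deploysolution -name SPHelloWorld.wsp -immediate -allowgacdeployment
psexec \\SPTEST01 %STSADM% -o execadmsvcjobs

rem Bring the admin service back up
psexec \\SPTEST01 net start SPAdminV4
```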
If you are interested, the customised TFS 2010 workflow template for building and deploying .wsp solutions can be downloaded from my SkyDrive here
Now define the additional parameters required for workflow initiation that need to be supplied by the user when starting a new Team Build.
Now set the activity properties for each activity. As a sample, I’ve shown below the properties set for the “Add solution” InvokeProcess workflow activity. Once all properties are set, WF4 validates your workflow. Once all validations pass, check in your custom build definition template to TFS.
Step6: Author a new Build definition based on our updated workflow template.
Some important parameters that you need to set are shown below.
Step7: It’s time to queue a new build. Now assume a scenario where a developer rushes to check in, overriding the check-in policies set by the build master.
And here is the result: the TFS build agent failed your build because the code did not pass code analysis. No .wsp files were produced.
So the developer now fixes the code with proper disposing as shown below and checks in the code again.
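The fix SPDisposeCheck expects is the standard using pattern around SPSite and SPWeb, for example (the site URL is a placeholder):

```csharp
using Microsoft.SharePoint;

// Before: SPSite/SPWeb hold unmanaged resources and are never disposed,
// which is what SPDisposeCheck flags at build time.
// SPSite site = new SPSite("http://sphelloworld");
// SPWeb web = site.OpenWeb();

// After: deterministic disposal with nested using blocks.
using (SPSite site = new SPSite("http://sphelloworld"))
using (SPWeb web = site.OpenWeb())
{
    // Work with the web here; both objects are disposed on exit.
}
```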
And after a few seconds we get this TFS build status. All green!
Opening the build log gives us the workflow execution log as shown below. Note how STSADM has been invoked remotely via psexec.exe on the TFS build agent.
And see below the end result in Central Administration in our test SharePoint farm.
On a final note, “the right tools make all the difference”. But tooling by itself may not help you build a quality product. The process matters. TFS 2010 supports your chosen process and allows you to tailor the default process templates if required. You can select the process template when you create a new Team project, as shown below. I highly encourage you to check out the new “Microsoft Visual Studio Scrum 1.0” template. This template has been co-developed by Microsoft in collaboration with the Scrum community and guides you through Scrum-based agile product development. I’ve highlighted a few areas below.
- Creating effective user stories
- Creating, prioritizing and grooming the product backlog
- Agile estimation and Sprint planning
- Scrum artefacts such as the sprint burndown and release burndown charts
And here is the high level process behind the template.
The correct tools coupled with the correct process will take your product or solution to the next level faster and more smoothly, and make you a successful SharePoint developer and, in turn, part of a successful product/solution delivery team.
Where to go from here?
- Disposing Best Practices
- Unit Testing SharePoint Solutions
- Unit Testing SharePoint with Pex & Moles
- How to Build SharePoint Projects with TFS Team Build
- Configuring TFS for building SharePoint and SPDisposeCheck (10min) by Jeremy Thake
- Scrum Resources
Windows Azure Marketplace DataMarket is a service that provides a single, consistent marketplace and delivery channel for high-quality information as cloud services. It offers high-quality statistical and research data across many business domains, from finance to healthcare. You can read more about it here
How do we consume this content in an enterprise scenario? In this post we look at how to integrate Azure Marketplace DataMarket with SharePoint 2010 via Business Connectivity Services, using a .NET connector to expose the data as an external list. This enables us to surface Azure DataMarket data through SharePoint 2010 lists, which opens up a world of possibilities, including taking the data offline with SharePoint Workspace and participation in workflows, enterprise search, etc.
Let’s first start with Azure DataMarket. Log on to https://datamarket.azure.com with your Live ID and subscribe to a dataset that you are interested in. I selected the European Greenhouse Gas Emissions dataset. Also note down your account key from the Account Keys tab, as we will need it later.
The Dataset details give you the input parameters and results as follows.
LINQPad is a nifty tool for developing and testing LINQ query expressions. It allows you to connect to Windows Azure DataMarket datasets and develop queries against the data as shown below.
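As an example, a query over the emissions feed might look like the following; the context and property names are assumptions based on the proxy LINQPad generates for the subscribed dataset:

```csharp
// Illustrative LINQ query against the DataMarket OData proxy (names assumed).
var worstYears =
    from e in ctx.GHGEmissionsEurope
    where e.Country == "Germany"
    orderby e.Emissions descending
    select new { e.Year, e.Emissions };
```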
Now let’s move on to VS2010 and create a BCS Model project as shown below.
Rename the default entity in the BDC Model designer and set the properties that you want to expose as the external content type.
Once all properties are set, the BDC Explorer displays my model as shown below.
Now our BDC model is ready. Next, let’s implement the ReadList and ReadItem BDC operations.
First, add a service reference to your project. Set the address to the location of your Azure DataMarket dataset and discover the OData feed as shown below.
After the service reference is added, we can implement the ReadItem method as shown below.
Note that the service reference context requires your Live ID and account key to be passed as credentials in the service call.
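A minimal ReadItem implementation along those lines could look like this; the service URL, container, entity and property names are assumptions taken from my generated proxy:

```csharp
using System;
using System.Linq;
using System.Net;

public static EmissionStat ReadItem(string id)
{
    // Context class generated by the Add Service Reference step (name assumed).
    var ctx = new GHGEmissionsContainer(
        new Uri("https://api.datamarket.azure.com/Data.ashx/EEA/GHGEmissions/"));

    // DataMarket uses Basic authentication: your Live ID as the user name and
    // the account key (from the Account Keys tab) as the password.
    ctx.Credentials = new NetworkCredential("you@live.com", "<account-key>");

    return ctx.GHGEmissions.Where(e => e.Id == id).FirstOrDefault();
}
```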
Once all operations are implemented, press F5 to build and deploy your BDC model to SharePoint. Once deployed, log on to Central Administration, go to Manage Service Applications, select Business Connectivity Services and view the details of the newly deployed EmissionStat content type. You might want to set permissions for this content type at this point.
Now that the external content type is defined, all we need is an external list based on it. Fire up SharePoint Designer and create an external list referencing the new external content type we just created.
Once the list is created, navigate to it in the portal. Note that Windows Azure Marketplace DataMarket content is now surfaced through a SharePoint 2010 list as shown below. The VS2010 solution associated with this post can be downloaded here
Viewing item details shows full item details using our ReadItem BCS operation.
Integrating structured data (Azure) with unstructured content (collaborative data) in SharePoint allows you to create high-business-value composites for informed decision making at all levels of your organisation. Happy cloud collaboration!
I’ve noticed this issue come up a few times in different environments where Team Foundation Server 2010 is configured with an out-of-the-box SharePoint (WSS) installation. I thought I’d write this post in case it helps someone having the same issue.
The problem: SharePoint search was not working in the production environment, and when you search you get an error message saying
‘Your search cannot be completed because this site is not assigned to an indexer. Contact your administrator for more information’.
You also notice that the SharePoint search service is running as a Windows service in the services console but is not visible under farm services in Central Administration.
The cause: the search services are not properly configured and registered at the SharePoint farm level.
The resolution steps:
1. Go to Central Admin and log in as the farm administrator
2. Go to Application Management tab
3. Select SharePoint Web Application Management heading | Content databases
4. Ensure your web application is the one selected
5. Select your content database name
6. Under Search Server – select your server (if your search server drop down is disabled please follow the additional steps below)
7. Open a command prompt in the 12-hive BIN folder (the folder that contains stsadm.exe and psconfig.exe)
8. Run the following command psconfig.exe -cmd services -install
9. Then run stsadm -o spsearch -action start (specify the farm account if required)
This process registers the search services within the SharePoint (WSS) farm, and the search service will now appear under Central Administration. Now you can click the search service and create an indexing schedule. After the indexing job has completed for the given web application, you will notice that search results appear as expected.
Windows Identity Foundation (WIF) is the platform on which SharePoint 2010 claims authentication is based. WIF, which is fully supported in SharePoint 2010, ADFS 2.0, ASP.NET, Windows Communication Foundation (WCF), and any other .NET application you care to develop, provides the infrastructure necessary to generate, transmit, and process claims-based identity in a simple and straightforward manner. It removes the roadblocks imposed by legacy authentication schemes like NTLM and Kerberos and puts control directly into the hands of developers, users, and IT security professionals. Long story short, it’s a framework written to help solve identity issues common in the era of cloud computing and service-oriented architecture.
The idea of claims based identity is one that many people are willing to try. Getting accurate information out there to the public though does take time.
The important point is this is based on industry standards. Many different entities are on board along with Microsoft in this matter. The digital world continues to give us new opportunities and those involved believe that this will help all of us to get the most out of it. There is a strong foundation in place to continue building upon. The use of AD FS v2, CardSpace, and Windows Identity Foundation are all important pieces of this puzzle.
As a demonstration of these capabilities, I’ll show how SharePoint 2010, WCF and WIF can be put together to solve the identity delegation problem. In part 1 of this demo session I start by establishing the trust relationship between ADFS 2.0 and SharePoint with PowerShell and demonstrate how the claims get into SharePoint. Then we build and deploy a claims viewer Web Part using the WIF programming model. In part 2 we start with a web service that fronts line-of-business information stored in a SQL database. Then we configure it to use WIF to request the calling user’s claims from SharePoint and process the token so that authorization decisions can be made. Finally, we surface this information in SharePoint 2010 as an external content type using Business Connectivity Services (BCS).
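The trust setup in part 1 boils down to a handful of PowerShell commands along these lines; the certificate path, realm, URLs and the claim chosen as the identifier are placeholders for the demo environment:

```powershell
# Import the ADFS 2.0 token-signing certificate and trust it at the farm level.
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\adfs-signing.cer")
New-SPTrustedRootAuthority -Name "ADFS 2.0 Token Signing" -Certificate $cert

# Map the incoming email claim; we will also use it as the identifier claim.
$email = New-SPClaimTypeMapping `
    -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" `
    -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

# Register ADFS 2.0 as a trusted identity token issuer for SharePoint.
New-SPTrustedIdentityTokenIssuer -Name "ADFS20" -Description "ADFS 2.0 identity provider" `
    -Realm "urn:sharepoint:portal" -ImportTrustCertificate $cert `
    -ClaimsMappings $email -SignInUrl "https://adfs.contoso.com/adfs/ls/" `
    -IdentifierClaim $email.InputClaimType
```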
Click here to view a recorded screencast of this session.
The slide deck for this session is shared here