Thoughts about migrating content into Yammer from forums


This article covers the considerations and a possible approach for planning and migrating existing forum content into Yammer. Yammer doesn’t provide any form of ‘migration’ or ‘import’ for content from other systems. Their stance is that the working paradigms created by Yammer are unique to the platform, and that managing the transition from the old system to Yammer is where the investment should be made. The Yammer article here describes their high-level thoughts on the subject; the key statement for me was:

2. Start fresh with content

Our customers’ experience has shown that technical migrations of content are not generally recommended, as information on Yammer tends to be more ephemeral in nature – not necessarily needed as material for future reference. There is better potential for users to be invested in and engaged with the new network if they are empowered to shape its structure and content themselves.

The realities of client engagements tend to be less black and white: most clients with established knowledge management forums want to consider carrying some form of information across into Yammer. Many of our clients have forums woven into their day-to-day behaviours, and the information within them can’t simply be left to wither on the vine.

One key takeaway before diving into the more detailed thoughts…

Consider why you are migrating: unless it’s aligned to a business benefit or business goal, the effort of the migration is misaligned with its value to the organisation. Also consider what your new operating model is. Don’t migrate an old operating behaviour along with the content; you MUST migrate the content into the new behaviours and approaches you aim to have with Yammer integrated within your organisation.

Understanding the source forum information

Whatever the technology platform involved, whether SharePoint forums or another forum technology, the key element of a successful migration is auditing and understanding the content within the current forums. Key elements to understand are as follows:

  • Users
  • Forums metadata such as Title, description and logo image
  • Security model
  • Thread content
    • Styles of thread, such as single origin, multi-thread
    • Thread parenting, which forum the earliest multi-origin thread comes from
    • Mentioning users
    • Mentioning metadata
    • Attachments
    • Links
  • Value add activities like adding additional tagging or additional metadata content

From there you need to plan the content destinations in Yammer. There will not always be a one-to-one forum-to-group mapping. In fact, now may be a sensible time to ‘refactor’ the approach to segregating the conversations into something more relevant or easier to understand.

User mapping

The source system is unlikely to hold user accounts in the same fashion as Yammer, so a mapping exercise is needed. Yammer users are keyed off the email address (the primary key, as such).

A full listing of the users who have interacted with the forum needs to be generated. Alongside this, a full extract of the Yammer user information should also be requested. To capture all users in the Yammer Network you must establish the first date your network existed and pull the data from that date, because the Yammer data export only returns users who were active during the requested period. This means the first extract needs to go a long way back to get all the required user records. Subsequent users can be pulled in more recent extracts if you need to refresh and append data.

From this point you have the source and destination user lists. The next step is to produce a mapping between them. In general a good approach is to manipulate the records in the source user list to append an email address where possible. If you can give each source user a unique email address then you can map them directly against the Yammer user list.

From the mapping exercise you’ll be left with users in both lists who are not matched. Ignore the non-matches on the Yammer side; they are not a problem for migration as they have never interacted with the source forum. The non-matched users in the source user list are where the focus should be placed. You have the following options:

  • Map all non-matched forum users to a generic account in Yammer, something like ‘Company Alumni’. This has the advantage that it’s simple to map and you can tie everything to one central user. During your Yammer rollout you can then inform the user base about the Alumni user and present its posts as historical record. The disadvantage of this approach applies where there is value in knowing who made every thread post. Consider a situation where the forum’s discipline is detailed and technical; users may still need to know who made which comment in order to trace that individual, even outside the organisation.
  • Map all non-matched forum users to newly created Yammer users. The advantage of this is that the user-to-thread relationship is maintained. The disadvantage is that you will have to consider how these users are managed; if DirSync is being performed this may be more complex. These users also need to be active during migration and then suspended once the threads are added.
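To make the mapping exercise concrete, here’s a minimal Python sketch. The field names (`id`, `email`) and the `alumni` fallback account are my assumptions about the shape of the two exported lists, not a Yammer format:

```python
def build_user_mapping(source_users, yammer_users, fallback_id="alumni"):
    """Map source forum user ids to Yammer user ids by email address.
    Source users with no matching Yammer email fall back to a generic
    'Alumni' style account (option one above)."""
    by_email = {u["email"].lower(): u["id"] for u in yammer_users if u.get("email")}
    mapping = {}
    for user in source_users:
        email = (user.get("email") or "").lower()
        mapping[user["id"]] = by_email.get(email, fallback_id)
    return mapping
```

Swapping `fallback_id` for a step that creates a new Yammer user instead gives you option two.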

For the purposes of forum migration it is unlikely you need to consider migrating the values within user profiles, but again this is worth discussing for your scenario.

Group mapping

Consider how you will map in the existing forums. Do they contain threads which span wide topics? Could it be beneficial to split an existing forum into smaller segments? If so, the thread-to-Yammer-Group mapping needs to establish this. Also consider that the groups may not yet exist, so you might have to work out creation and admin assignment as well. Be careful not to orphan any threads whose topics no longer fit into the Yammer Group structure.

The big difference between Yammer Groups and many forum-based technologies is that Yammer Groups lack any form of hierarchy, so where a source forum may group several sub-sections, Yammer Groups don’t behave in this fashion. My thoughts here are that you should structure the Yammer Network to support the new working patterns you are creating; it is likely that the Yammer rollout is in support of other initiatives. So consider whether the new Groups are topic/subject centric, and how you might have to map threads into those topics. Many of the implementations we have been rolling out use a SharePoint site to aggregate the groups into a form of hierarchy or association. This is a loose coupling, though, and a sensible approach; the association to a hierarchy is only appropriate if the user scenario warrants it, such as a knowledge network.

Group metadata such as name, logo and description are also key for discovery by users within the Yammer Network. Consider also telling users what the primary language is for the group. Yammer has a pretty good translation mechanism using Bing Translate, but some groups naturally centre around a specific language.

Consider user membership for new Yammer Groups: do you want to add the users from the existing forums to the new Yammer Group? Sometimes this is a straight one-to-one mapping; sometimes, if you re-jig the structure, it becomes more complex. A simple algorithm is to capture all the users involved in the group’s threads and make sure they get added to the Group members. In principle this goes against what I’d normally propose: users really shouldn’t be forcibly added to a group, as it’s bad practice for good engagement and really annoys people.
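That simple membership algorithm can be sketched as follows; the thread/message shape is an assumption about your source export:

```python
def members_for_group(threads):
    """Collect the distinct users who posted in any of a group's threads,
    as candidate members for the new Yammer Group."""
    members = set()
    for thread in threads:
        for message in thread["messages"]:
            members.add(message["author"])
    return sorted(members)
```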

Threads and Content

Threads and content form the cornerstone of the forum experience; in Yammer these map roughly to ‘Conversations’ and individual ‘Yams’. The Yammer paradigm is quite straightforward structurally.

  • Network – The overall container for all the content
  • Groups – The segmentation of the content within the network into groupings of people who generate content
  • Conversation (aka a thread) – The conversations which are started and replied to
  • Yam (aka a message) – An individual post within the conversation thread.


Each thread message needs to be analysed for the following:

  • Its parent message, so that it can be correctly associated into a new Yammer conversation
  • Any forum members mentioned in the content, so that the correct @mentions can be inserted into the new Yam
  • Any tagging used, so that the correct @mentions can be inserted into the new Yam (I know it sounds weird to say @mentions again, but users and tags are effectively the same with some metadata differences)
  • Any attachments being referenced by the message, so that the correct handling approach can be used (more about attachments in a minute)
  • Any Hrefs within the content, so that you can decide how to deal with the link (more about that in a few more lines)
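A rough sketch of that per-message analysis, assuming the source export exposes `body`, `parent_id` and `attachments` fields. The regexes are illustrative only; real forum markup will need more care:

```python
import re

MENTION = re.compile(r"@(\w+)")                      # forum-style @name mentions
HREF = re.compile(r'href="([^"]+)"', re.IGNORECASE)  # embedded links

def analyse_message(message):
    """Extract the elements listed above from one forum message."""
    body = message.get("body", "")
    return {
        "parent_id": message.get("parent_id"),  # for conversation threading
        "mentions": MENTION.findall(body),      # to become Yammer @mentions
        "links": HREF.findall(body),            # each needs a mapping decision
        "attachments": message.get("attachments", []),
    }
```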


Handling hrefs can take a couple of approaches:

  • Keep them as is and just ensure they map into the content specification of Yammer
  • Modify the URL against a mapping rule set, creating a permanent redirect to other migrated content
  • Modify the URL against a mapping service, creating a permanent redirect to something that can manage the redirection as content moves around in supporting platforms

If you go for an external link redirection service, consider how long it will be in place. Consider the overall benefits of redirection and watch the analytics to ensure that users are still getting benefit from the service.
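The rule-set option might look like this; the prefixes are hypothetical examples:

```python
def map_href(url, rules):
    """Rewrite a link against an ordered (old_prefix, new_prefix) rule set;
    unmatched links pass through unchanged."""
    for old_prefix, new_prefix in rules:
        if url.startswith(old_prefix):
            return new_prefix + url[len(old_prefix):]
    return url
```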

I did consider using SharePoint query rules as a way to manage redirection. Imagine you create key threads in Yammer; you can create links within them that point to a SharePoint site, and each site can have its own query rules for content promotion. So where your knowledge management community also has a SharePoint site collection, you can implement a local collection of search query rules. If you take this approach you can effectively ‘map’ an href into a search-based URL with specific terms; it then fires across to SharePoint and the relevant promoted results are presented.


Dealing with attachments can take several forms:

  • Uploading the content into Yammer directly and effectively storing that content in Yammer
  • Making reference to external sources such as a web page where the content lives

Depending on your requirements you may need to choose the most relevant option almost on a message-by-message basis. My recommendation would be to store information outside of Yammer where it better aligns with collaboration systems such as Office 365 sites, and upload directly where it makes more sense as social content. An obvious example is a rule set which treats all Office document formats as uploads to SharePoint, while things such as images go into Yammer. You may also decide to treat some messages as ‘OpenGraph’ objects and have Yammer treat them more like a notification than a message.
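That rule set could be sketched as below; the extension lists are assumptions to adapt to your own content audit:

```python
OFFICE_EXTENSIONS = {".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx"}
IMAGE_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif"}

def attachment_destination(filename):
    """Decide, per attachment, whether it belongs in SharePoint or Yammer."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in OFFICE_EXTENSIONS:
        return "sharepoint"   # better aligned with collaboration systems
    if ext in IMAGE_EXTENSIONS:
        return "yammer"       # makes more sense as social content
    return "review"           # anything else needs a manual decision
```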

Thread sharing

Yammer allows a thread, once created, to be shared to another group. When sharing, the sharer can choose to add additional text to the conversation share. Once shared, the thread is effectively a new thread with its own content. With this in mind you need to analyse the sharing paradigm in the current forums and decide how to map it into Yammer.

Consider situations where a thread appears across multiple forums. For example:

  • Original post goes into forum A, then is shared to forum B and C
  • When a user views forum A, B or C the entire thread appears, thus capturing all the conversation once but displaying it in multiple places

You need to consider how this maps into the Yammer sharing paradigm. I think the most sensible approach is to decide which Yammer Group the thread is most suited to; this is probably a manual decision in most cases. Once targeted, post the whole thread to that group, then share the complete thread into the other groups. It might also be worth retrospectively adding ‘tags’ to the thread.


Yammer has a tagging concept called ‘Topics’. Each thread can be tagged with Topics to help discoverability, and a user can elect to follow a specific Topic. So when migrating, analyse the content for potential topics and ensure the Yam is tagged with them. This is a value-add activity and would require quite a lot of intelligence to identify topics in existing threads.
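As a starting point, a naive keyword heuristic can suggest candidate Topics. The keyword map is something you’d curate by hand, and this is nowhere near the ‘intelligence’ a real pass would need:

```python
def suggest_topics(body, topic_keywords):
    """Suggest Yammer Topics for a message body by simple keyword matching.
    topic_keywords maps a Topic name to a list of trigger words."""
    lowered = body.lower()
    return [topic for topic, words in topic_keywords.items()
            if any(word in lowered for word in words)]
```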


As you add new Yams to Yammer threads you can’t influence the date stamp. So if you need the original dates to be known, the only real solution is to append the information into the Yam text.
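For example, a small helper to prepend the source post date (the exact wording is just my convention):

```python
from datetime import datetime

def yam_body_with_original_date(original_body, posted_at):
    """Prefix the Yam text with the original forum post date, since the
    API stamps new Yams with the migration time instead."""
    stamp = posted_at.strftime("%d %b %Y %H:%M")
    return f"[Originally posted {stamp}] {original_body}"
```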

Security

Always a thorny issue: the paradigm of information security conflicts with the engagement and openness of working like a network.

There is definitely no right answer on how to handle security, so these thoughts are where I’d propose you start.

Yammer’s paradigm really needs Yammer Groups to remain publicly visible and readable to reap the benefits of working like a network, so you need to plan how to ensure information which was secure remains so. Often the existing forums are restricted by ‘membership’ as a way to keep tight control of who can edit, often driven by reporting and funding concerns about membership numbers rather than true data sensitivity. For that scenario, in most cases adopting the public group paradigm is fine.

Where you need a private group, say for example the Board, then using a Private group is perfectly acceptable.

You will need to audit the information in every thread for compliance if you want to be 100% sure that you’ve not exposed restricted information from the existing forums in the new Yammer Groups.

User engagement and roll out

Although not strictly migration in the technical sense, don’t underestimate how you need to deal with users moving from the old world into the new. Think about how to help users find threads that have moved.

One possible solution is to add a new reply in the existing forums which contains information about the thread’s location within Yammer. This helps anyone who has the old thread bookmarked come away with a smile, as they can navigate to the Yammer conversation and continue the discussion. It also allows for a period of dual running where some, but not all, threads have made it into Yammer.

Another aspect to consider is publishing the whole process for the end user to read. It sounds a little bonkers at first: why should users care or be told about the internals of an IT project? Well, think about what Yammer has as a central pillar… openness. So help users get into the correct mindset.

Yammer Technical

App Registration

The processing app must be registered within Yammer. Instructions for the App Registration process can be found in the Yammer Developer Center:

Once the migration processing app is registered the Client Id and Client secret can be used by the app to make calls into the Yammer Network.

Authentication

The processing app must authenticate to Yammer in order to make calls to the API. Information about the authentication process and options can be found in the Yammer Developer Center:

OAuth Impersonation

To interact with the Yammer API on behalf of another user you will need to use a Verified Admin account for the processing app and have it call the relevant REST API endpoint to collect the user token for the user in question. Information about impersonation can be found in the Yammer Developer Center:
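In outline that impersonation flow might look like the sketch below. The endpoint path and response shape are my reading of the developer documentation at the time of writing, so verify them against the current docs before relying on this:

```python
import json
import urllib.request

def fetch_user_tokens(admin_token, network_url="https://www.yammer.com"):
    """Call the tokens endpoint as a Verified Admin and return the parsed list."""
    request = urllib.request.Request(
        f"{network_url}/api/v1/oauth/tokens.json",
        headers={"Authorization": f"Bearer {admin_token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

def token_for_user(tokens, user_id):
    """Pick the impersonation token for one user from the returned list."""
    for entry in tokens:
        if entry.get("user_id") == user_id:
            return entry.get("token")
    return None
```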

Users

The Yammer API allows you to retrieve, create, update and suspend (soft delete) users. Information about the exact API call format can be found here:
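A sketch of the user management side; the verb and path here are my reading of the documentation (DELETE to suspend), so check them before use:

```python
import urllib.request

def user_endpoint(user_id, network_url="https://www.yammer.com"):
    """Build the REST endpoint URL for a single user."""
    return f"{network_url}/api/v1/users/{user_id}.json"

def suspend_user(user_id, admin_token):
    """Soft-delete (suspend) a placeholder user once its threads are migrated."""
    request = urllib.request.Request(
        user_endpoint(user_id),
        headers={"Authorization": f"Bearer {admin_token}"},
        method="DELETE",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```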

Rate limits

The migration processing app must have logic to throttle the calls made to Yammer so as not to exceed these thresholds. It should also monitor the responses and act accordingly, retrying to ensure data integrity in the event of API calls being blocked. Information about the Yammer REST API rate limits can be found in the Yammer Developer Center:

The content at the time of writing is as follows:

API calls are subject to rate limiting. Exceeding any rate limits will result in all endpoints returning a status code of 429 (Too Many Requests). Rate limits are per user per app. There are four rate limits:

Autocomplete: 10 requests in 10 seconds.

Messages: 10 requests in 30 seconds.

Notifications: 10 requests in 30 seconds.

All Other Resources: 10 requests in 10 seconds.

These limits are independent e.g. in the same 30 seconds period, you could make 10 message calls and 10 notification calls
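Given those buckets, the processing app can keep a sliding window of call times per bucket and sleep before it would exceed a limit. A minimal sketch:

```python
import time
from collections import deque

class RateLimiter:
    """Client-side throttle for one rate-limit bucket,
    e.g. RateLimiter(10, 30) for the messages bucket quoted above."""

    def __init__(self, max_calls, per_seconds, clock=time.monotonic):
        self.max_calls = max_calls
        self.per_seconds = per_seconds
        self.clock = clock
        self.calls = deque()   # timestamps of recent calls

    def wait_time(self):
        """Seconds to sleep before the next call is safe (0.0 if safe now)."""
        now = self.clock()
        while self.calls and now - self.calls[0] >= self.per_seconds:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            return 0.0
        return self.per_seconds - (now - self.calls[0])

    def record(self):
        """Note that a call has just been made."""
        self.calls.append(self.clock())
```

On top of this, any 429 response should still trigger a back-off and retry, since server-side counting may not match the client’s.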

Networks

Also useful during the process is the availability of an Enterprise Yammer network, mainly so you can get admin functions like data export and impersonation working. The question of having a discrete ‘dev’ network came up; my personal thoughts are as follows. If you’re not ‘live’ with Yammer while doing the migration work, why not simplify the moving parts by keeping just the main Network? You can always try things in a private group first to write your processing code, then open it up once the code is tested. Worst case, you add messages you need to delete later. This keeps costs down. If, however, you’re already using Yammer, then it’s best to go for multiple Networks. Again it’s your choice, but Yammer is a service that you can’t actually customise, so your options are safer in my opinion.

NB: This information is my pre-execution thinking and hopefully after the work is complete I’ll come back and make any amendments to the information.

CKSDev for Visual Studio 2012 version 1.2 Released

The 1.2 version has just been released. It brings across many more of the features you had available in VS2010, and the remaining features will arrive over the coming weeks as personal time allows. As you can imagine it’s no mean feat to port all of the existing features and support multiple SharePoint versions. The code base is also on a diet, as code bloat was getting a little crazy as the tools evolved over the past four years.
You can find CKS: Development Tools Edition on CodePlex at
Download the extension directly within VS2012 from the extension manager or visit the Visual Studio Gallery.
We’re still looking for extension ideas so please post in the CKS Dev Discussions anything you think would make life better for SharePoint developers.

CKS Dev Feature highlights

  • Copy Assembly Name – From the context menu of a SharePoint project you can now get the full assembly name copied to the clipboard.
  • Cancel adding features – Automatic cancellation of new SPIs being added to features. You can turn this off via the CKSDev settings options.
  • Find all project references – Find where a SPI/Feature is being used across the project from the project context menu.
  • Activate selected features – Setup which package features you want to auto-activate from the project context menu.
  • New features:
  • ASHX SPI template – Produces a full trust ASHX handler.
  • Basic Service Application template – Produces a full trust basic service application.
  • Branding SPI Template – Produces a full collection of SP2010 branding elements as baseline items.
  • Contextual Web Part SPI template – Produces a contextual ribbon enabled web part.
  • WCF Service SPI template – Produces a full trust WCF Service endpoint.
  • Web template SPI template – Produces a SP2010 web template.
  • Improvements to Quick Deploy – Performance improvements from switching away from calling into GACUtil.exe and returning to direct GAC API calls. Also removal of ‘custom tokenisation’ for now, until a more performant version is tested.
  • Quick deploy GAC/Bin deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the assemblies to either the BIN or GAC depending on their packaging configuration, and runs the post-deployment command line.
  • Quick deploy Files deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the SharePoint artefacts to the SharePoint Root, and runs the post-deployment command line.
  • Quick deploy all deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the assemblies to either the BIN or GAC depending on their packaging configuration, copies the SharePoint artefacts to the SharePoint Root, and runs the post-deployment command line.
  • Upgrade deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, upgrades the previous version of the solution, and runs the post-deployment command line.
  • Attach To IIS Worker Processes Step – Which attaches the debugger to the IIS Worker process during deployment.
  • Attach To OWS Timer Process Step – Which attaches the debugger to the OWS Timer process during deployment.
  • Attach To SPUC Worker Process Step – Which attaches the debugger to the User Code process during deployment.
  • Attach To VSSPHost4 Process Step – Which attaches the debugger to the Visual Studio deployment process during deployment.
  • Copy Binaries Step – Copies the assemblies during deployment to either Bin or GAC.
  • Copy To SharePoint Root Step – Copies the files during deployment to the SharePoint Root.
  • Install App Bin Content Step – Copies the files to the App Bin folder during deployment.
  • Install Features Step – Installs the features during deployment.
  • Recreate Site Step – Recreates the deployment targeted site during deployment.
  • Restart IIS Step – Restarts IIS during deployment.
  • Restart OWS Timer Service Step – Restarts The SharePoint Timer service during deployment.
  • Upgrade Solution Step – Upgrades the solution during deployment.
  • Warm Up Site Step – Calls the site to get SharePoint to warm it up during deployment.
To see more details and the full feature list visit

Visit the CodePlex site for more information about the features and release.

Share and Enjoy and thanks for the continued support

CKSDev for Visual Studio 2012 version 1.0 Released


Since 2009 CKSDev has been available to SharePoint 2010 developers to aid the creation of SharePoint solutions. The project extended the Visual Studio 2010 SharePoint project system with advanced templates and tools. Using these extensions you were able to find relevant information from your SharePoint environments without leaving Visual Studio. You experienced greater productivity while developing SharePoint components and had greater deployment capabilities on your local SharePoint installation.

The Visual Studio 2012 release of the MS SharePoint tooling supports both SharePoint 2010 and 2013. Our aim for CKSDev was to follow this approach and provide one tool to support both SharePoint versions from within Visual Studio 2012.

The VS2012 version has just been released. It contains the core quick deployment and server explorer features you had available in VS2010, and the remaining features will arrive over the coming weeks as personal time allows. As you can imagine it’s no mean feat to port all of the existing features and support multiple SharePoint versions. The code base is also on a diet, as code bloat was getting a little crazy as the tools evolved over the past four years.

You can find CKS: Development Tools Edition on CodePlex at

Download the extension directly within VS2012 from the extension manager or visit the Visual Studio Gallery.

We’re still looking for extension ideas so please post in the CKS Dev Discussions anything you think would make life better for SharePoint developers.

CKS Dev Feature highlights

  • Copy assembly name so no more need to use reflector to get the full strong name.
  • New Server Explorer nodes to list feature dependencies, site columns, web part gallery, style library, theme gallery.
  • Enhanced import functionality for web parts, content types, themes.
  • Create page layout from content type (publishing).
  • Copy Id.
  • Coming Soon…
  • Quick deploy to copy artefacts into the SharePoint root without the need for a build.
  • Deployment steps – Coming soon…

To see more details and the full feature list visit

Visit the CodePlex site for more information about the features and release.

Share and Enjoy and thanks for the continued support

SUGUK 1st March 2012


So last night saw the return of the London SUGUK.

Matt Taylor has stepped down as co-coordinator for London, and I want to thank Matt for all his efforts in establishing such a great user group.

So Steve Smith started proceedings with some ITPro and End User snippets. Starting with a small 3 slide long PowerPoint he dived straight into some demos.

The demos began with some SMTP love. I must admit most of the talk of AD etc. went a little over my head, so frantic note-taking was the order of the day. There were some neat tips about delegating administration via a customised MMC console, and how to get around some production system challenges regarding SharePoint self-managing AD.

The next demo came down to SQL Server databases and the fact that SharePoint doesn’t always choose the best defaults. I learnt some nice things about pre-creating a SQL database and then using Central Admin, ensuring the defaults were better suited to production scale. The points about database growth were also very interesting, and I could see how these could be quickly leveraged to improve performance.

Finally was a quick run through of some Office functionality connecting search into the Office application and how to configure ‘save as’ links from the MySite UPA.

Then a short break in proceedings gave me just enough time to connect up my laptop and hope it played nice with the projector…. something about this W520 hates projectors….

Breathing a sigh of relief I launched into a Dev centric topic of Extending the Activity Feed within SP2010. The demo source code can be downloaded from code download and the slides can be seen below.

Thanks again to Steve for inviting me to present; it’s always a pleasure dusting off Visual Studio and sharing something cool with the SUGUK.

Speaking at the London SharePoint User Group


The London SharePoint user group kicks off 2012 with Steve Smith and I presenting.

Session 1


SharePoint Administration – I always wondered what that was for.

By Steve Smith – Combined Knowledge

In this session Steve is going to help everyone understand some of the quirks and options that we see in SharePoint, what they do (or break), and how we then go about building the infrastructure for it. An ideal session for SharePoint admins and developers alike. Plus he will throw in some power user stuff to give everyone something to take away.

Session 2


Extending the SharePoint 2010 activity system

By Wes Hackett – Content and Code

Amongst the most anticipated new features of SharePoint 2010 was the social activity feed, which brings colleague activity to an individual as a feed. Natively the activity feed displays user profile changes, tagging and notes activity. Microsoft provides an API to extend the activity feed system with your own content, making it possible to include custom activities. In this session we’ll explore the native system and the elements needed to extend it.

When and where?

March 1st

Start Time: 6PM
Finish: 9PM

After the event we’ll retire to a local watering hole.

Unfortunately there may not be food or beverage available for the meeting so please bring any drinks with you just in case.

Cavendish Conference Centre
22 Duchess Mews
London W1G 9DT

Sign up…

See you there…

SharePoint Saturday 2011


Saturday 12th November saw the 2011 SharePoint Saturday at Nottingham’s Conference Centre. Another great event organised with so many quality speakers and companies in attendance.

I presented a session on Social Intranet and the slides can be seen below. I hope everyone found the session useful; I certainly enjoyed presenting to such an interactive audience.

I just want to thank everyone involved in the organisation and planning of the event; it was awesome again.

Ratings and SharePoint Search better together


SharePoint 2010 introduces a new feature allowing users to rate content within sites. Depicted by five stars, the user can rate content, and these ratings are collated to provide the average rating for the item. Displaying content based on ratings can help users determine the quality of content easily. Ratings also help content authors understand which content readers consider to be higher quality.

The native rating user interface is a collection of five stars. After a user selects the desired rating it is submitted and averaged with the other ratings for the content. The ratings are processed by a timer job process so there is some small delay.

By default ratings can be configured on lists and libraries and also added to page layouts. They also get fed into the Activity Feed system as rating activities. These add some great user focused features, but in my opinion there is a missing piece.


So I got thinking, having seen some articles about additional search managed properties. Seeing as the average rating and rating count are both columns added to the list/library, I decided to investigate whether it was possible to index them and therefore present the rating within the search results.

So here the journey of rating discovery begins…

Adding ratings to lists or libraries


Before we can investigate getting ratings into the search experience we need to get rating up and running on some content. So let’s go ahead and create a document library as our demo content. Upload numerous documents into the library (at least five) so that we have some content to rate.

So that’s the content ready for rating so what next?

  • Browse to the library.
  • From the ‘Library’ ribbon click the ‘Library settings’ button.
  • Under the general heading click the ‘Rating settings’ link.
  • Under the ‘Allow items in this list to be rated’ click yes and ‘Ok’.

This adds two columns to the library: ‘Rating (0-5)’ and ‘Number of Ratings’. By default the Rating column is added to the view. You can also add the number column to the view if you want (more on that idea later). It should also be noted that these columns will be added to any bound content types as well.

That’s the content prepared for rating, so now go ahead and click some ratings. For my demo I logged in as several users and rated the documents to cover several of the star values.



Timer job


As mentioned earlier, the ratings are calculated via a timer job called ‘{user profile SA name} – Social Rating Synchronization Job’. This job aggregates the ratings. To speed up development you can manually execute the timer job to trigger the rating aggregation.



Search Managed Properties


So with the native rating functionality configured and ready for indexing, it’s time to swing over to the farm Search Service Application to perform the steps needed to index our ratings.

As with any content element within SharePoint, for it to be usable in search it needs to be a managed property. By default most common content fields are already configured; ratings, however, are not. So the first step is to create managed properties for the ‘Rating (0-5)’ and ‘Number of Ratings’ columns.


AverageRating Managed Property


To create the average rating property, map the ‘ows_AverageRating(Decimal)’ crawled property as shown in the screenshot below.




Once the properties are configured an index of the content source is required. Browse to ‘Content Sources’ and start a crawl. While this is whizzing along in the background the modifications to the search results web part can be made.


Search Results web part


The standard Search Results web part provides the view of the content found matching the query term.


As you can see, no rating information is displayed. So we’re going to modify the results web part to include the rating and rating count below the content description.


Adding the columns to fetched data


To show the AverageRating and RatingCount results they need to be added to the ‘Fetched Properties’ XML within the web part settings (the circled element in the screenshot below).


Modify the XML to add the new columns. The example below lists the new columns last; you can copy this whole list or just append the two new columns to your existing one.

<Column Name="WorkId"/>
<Column Name="Rank"/>
<Column Name="Title"/>
<Column Name="Author"/>
<Column Name="Size"/>
<Column Name="Path"/>
<Column Name="Description"/>
<Column Name="Write"/>
<Column Name="SiteName"/>
<Column Name="CollapsingStatus"/>
<Column Name="HitHighlightedSummary"/>
<Column Name="HitHighlightedProperties"/>
<Column Name="ContentClass"/>
<Column Name="IsDocument"/>
<Column Name="PictureThumbnailURL"/>
<Column Name="PopularSocialTags"/>
<Column Name="PictureWidth"/>
<Column Name="PictureHeight"/>
<Column Name="DatePictureTaken"/>
<Column Name="ServerRedirectedURL"/>
<Column Name="AverageRating"/>
<Column Name="RatingCount"/>

Apply the changes to the web part. Now the data is coming back within the search result set.


Modifying the XSLT


Next step is to get these properties displaying.


I’m not that skilled in front-end coding so this will demo the concept; I’m sure those more creative design peeps will add their own flair to the visuals.

So I’m choosing to inject the rating and number of raters just after the title and description. Therefore locate the ‘<div class="srch-Metadata2">’ div to inject the new code. Below is a snippet from the section and includes the calls to the new templates.

<div class="srch-Metadata2">

<xsl:call-template name="stars">
<xsl:with-param name="starCount" select="averagerating"/>
</xsl:call-template>
<xsl:call-template name="ratingcount">
<xsl:with-param name="ratingCount" select="ratingcount"/>
</xsl:call-template>

<xsl:call-template name="DisplayAuthors">
<xsl:with-param name="author" select="author" />
</xsl:call-template>
As you can see I’ve introduced two new templates, one for each property.

Before we dive into the templates there is something important to share. To improve performance most visual images are displayed in CSS sprite format. This means a CSS class positions the image so only the required section of one large map of images shows. The rating control is no different and uses the sprite image found at ‘/_layouts/Images/Ratings.png’.
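For illustration only (these are not the actual rules from corev4.css), a sprite class works roughly like this: it fixes the element size and shifts the shared background image so only the wanted slice shows.

```css
/* Illustrative sketch only - the class name matches the native convention,
   but the sizes and offsets here are made up. Each rating class exposes a
   different slice of the single Ratings.png sprite. */
.ms-rating_3 {
    display: inline-block;
    width: 80px;                   /* width of one star row */
    height: 16px;                  /* height of one star row */
    background-image: url('/_layouts/Images/Ratings.png');
    background-repeat: no-repeat;
    background-position: 0 -48px;  /* hypothetical offset for the 3-star row */
}
```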




So the template to render the stars needs to make use of the same native css classes.

The following is the star rating template. It contains the logic to read the rating value and generate the relevant css positioned rating image.

<!-- The Stars displaying -->
<xsl:template name="stars">
<xsl:param name="starCount"/>

<span class="ms-currentRating">
<!-- Set the correct css sprite for the number of stars -->
<xsl:choose>
<xsl:when test="$starCount &gt;= 5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_5" alt="Current average rating is 5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 4.5 and $starCount &lt; 5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 4.5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_4_5" alt="Current average rating is 4.5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 4 and $starCount &lt; 4.5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 4 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_4" alt="Current average rating is 4 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 3.5 and $starCount &lt; 4">
<xsl:attribute name="title">
<xsl:text>Current average rating is 3.5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_3_5" alt="Current average rating is 3.5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 3 and $starCount &lt; 3.5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 3 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_3" alt="Current average rating is 3 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 2.5 and $starCount &lt; 3">
<xsl:attribute name="title">
<xsl:text>Current average rating is 2.5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_2_5" alt="Current average rating is 2.5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 2 and $starCount &lt; 2.5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 2 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_2" alt="Current average rating is 2 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 1.5 and $starCount &lt; 2">
<xsl:attribute name="title">
<xsl:text>Current average rating is 1.5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_1_5" alt="Current average rating is 1.5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 1 and $starCount &lt; 1.5">
<xsl:attribute name="title">
<xsl:text>Current average rating is 1 star.</xsl:text>
</xsl:attribute>
<img class="ms-rating_1" alt="Current average rating is 1 star." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:when test="$starCount &gt;= 0.5 and $starCount &lt; 1">
<xsl:attribute name="title">
<xsl:text>Current average rating is 0.5 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_0_5" alt="Current average rating is 0.5 stars." src="/_layouts/Images/Ratings.png" />
</xsl:when>
<xsl:otherwise>
<xsl:attribute name="title">
<xsl:text>Current average rating is 0 stars.</xsl:text>
</xsl:attribute>
<img class="ms-rating_0" alt="Current average rating is 0 stars." src="/_layouts/Images/Ratings.png" />
</xsl:otherwise>
</xsl:choose>
</span>
</xsl:template>



As well as the rating we’re adding the number of people who have rated. This gives the consuming user a decent idea of the level of interest the item has had. The following is the rating count template.

<!-- The Rating Count displaying -->
<xsl:template name="ratingcount">
<xsl:param name="ratingCount"/>

<xsl:choose>
<xsl:when test="$ratingCount = 1">
<xsl:text>Rated by 1 person.</xsl:text>
</xsl:when>
<xsl:when test="$ratingCount &gt; 1">
<xsl:text>Rated by </xsl:text>
<xsl:value-of select="$ratingCount" />
<xsl:text> people.</xsl:text>
</xsl:when>
<xsl:otherwise>
<xsl:text>Not rated.</xsl:text>
</xsl:otherwise>
</xsl:choose>
</xsl:template>



With all these changes applied to the web part settings, save and publish the page and you should find the results now look something like the following screenshot.


OK, so that is pretty cool: users can now see content and people’s ratings of it to help find the right things. It’s not quite the whole story on the cool things to display though; refiners are the other.


Refining by Ratings


With the ratings being displayed in the results it got me thinking that refiners for the rating value and the number of people who had rated would be a useful addition.

The native Refinement Panel web part allows the customisation of the refiners via its settings. One important point to note is that the web part settings will have no effect unless the ‘Use default configuration’ check box is unchecked.


The Rating refiner is below.

<Category Title="Rating"
          Description="The average rating for the item."
          Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator"
          MappedProperty="AverageRating"
          MoreLinkText="show more"
          LessLinkText="show fewer"/>

The Number of Ratings refiner is below.

<Category Title="Number of ratings"
          Description="The number of ratings from people for the item."
          Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator"
          MappedProperty="RatingCount"
          MoreLinkText="show more"
          LessLinkText="show fewer"/>

Add these to the XML property of the web part.


Save the page and you end up with the refiners for ratings and number of ratings available.


Going the extra step with the refiners

So the initial refiners display any found values. A nice proof of concept would be to add some filter groups that band rating values together into Bronze, Silver and Gold instead of 0-5. That is possible with the refiner property grouping using ranges.
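I haven’t wired this up end to end, but based on how the out-of-the-box date range refiners are defined, a banded Rating category would look something like the sketch below. The band boundaries and the `MappingType`/`DataType` values are assumptions to verify against the Filter Category Definition schema.

```xml
<!-- Sketch: band the AverageRating values into Bronze / Silver / Gold -->
<Category Title="Rating"
          Description="The average rating for the item."
          Type="Microsoft.Office.Server.Search.WebControls.ManagedPropertyFilterGenerator"
          MappedProperty="AverageRating">
  <CustomFilters MappingType="RangeMapping" DataType="Numeric"
                 ValueReference="Absolute" ShowAllInMore="False">
    <CustomFilter CustomValue="Bronze">
      <OriginalValue>..2</OriginalValue>
    </CustomFilter>
    <CustomFilter CustomValue="Silver">
      <OriginalValue>2..4</OriginalValue>
    </CustomFilter>
    <CustomFilter CustomValue="Gold">
      <OriginalValue>4..</OriginalValue>
    </CustomFilter>
  </CustomFilters>
</Category>
```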


Hopefully this no-code solution adds some extra sweetness to the use of ratings and search in equal measure.

A status by any other name would still smell as sweet

While investigating status updates from within the Activity Feed it was necessary to understand the native social status update logic. The aim was to allow commenting a-la Facebook against any of the activity feed events in the feed. This article presents the information learnt about the native status update logic.
An example Facebook comment:
Fundamentally this could be achieved by adding a new ‘Social Comment’ against the item. I suspect there will be some challenges with this approach, such as the fact it would generate ‘Note Board’ activity events by default. Before diving into that I thought it would be a sound idea to understand more about where the data comes from for each activity type. This article covers the Profile Status.
In my previous post about the Activity Templates you can find the template specific to the ‘Status update’:
{Publisher} says "{Value}".
This is used to display the Status update to your colleagues.
So first up I wanted to understand what happens when you type a new status.
The control which is used is the ‘Microsoft.SharePoint.Portal.WebControls.StatusNotesControl’. This control renders the required input box and references for the client side script to make this pretty seamless for the end user. Under the bonnet it sets a property called ‘SPS-StatusNotes’ against the current user’s profile.
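Because the control simply writes to that profile property, a status can also be set programmatically from the server side. A rough sketch (the site URL and account are placeholders):

```powershell
# Sketch: update a user's status by writing the SPS-StatusNotes property.
$site = Get-SPSite "http://intranet"            # placeholder URL
$ctx  = Get-SPServiceContext $site
$upm  = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($ctx)
$profile = $upm.GetUserProfile("DOMAIN\user")   # placeholder account
$profile["SPS-StatusNotes"].Value = "Heads down writing XSLT"
$profile.Commit()
```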
Looking at the profile property, several take-away nuggets are worth noting:
  • The default max length for a status update is 512 characters
  • The privacy setting is ‘Everyone’
  • The ‘show in the profile properties section of the users profile page’ is unchecked. (If checked the value would have appeared below ‘Development’ in the screenshot above)
  • The property is not configured to be indexed by Search. This could be useful to enable if you wanted to show this value within the results page or build a component to return these results using search.
Once committed, the ‘People’ incremental search crawl captures this change marker, and the next time the Activity Feed timer job executes it will generate the status update event.

Office 365 User Group–26th September 2011

The launch event for the UK Office 365 user group. Hosted down at Microsoft’s Victoria offices, the first event saw a collection of Microsoft staff, partners and a small number of Office 365 customers arrive to kick off this new community.
You can follow the user group on Twitter on the #o365uguk or #o365ukug hash tag. I’m sure one was the proposed tag but both sprang to life.
Arno Nel (MVP) is the group coordinator

Session One

Session one kicked off with Steve Green from Microsoft introducing the user group. I’ve had the pleasure of working with Steve at one of my enterprise clients discussing SharePoint 2010 cloud offerings, so I was looking forward to his presentation.
Steve’s session covered the transition from BPOS-S to Office 365. We learnt that about 55,000 seats globally are on BPOS-S and Microsoft plan to migrate these to Office 365 by September 2012. One interesting nugget was that Microsoft will only discuss Office 365 dedicated usage at around the 30,000 seat level.
The customer is notified 14 days prior to their tenant being migrated and has the option of a postponement of up to 30 days. Once notified, parts of the tenant are being replicated and migration syncing is in progress; effectively the point of no return.
Although widely publicised, it was mentioned that Office 2003 and IE6 are not supported, nor is the OCS client connecting to the Lync servers. To be fair, no-one in the front-end teams will miss IE6.
During the transition Microsoft are responsible for the following:
  • Informing the customer, although there are some challenges with the email communications here: for unattended mailboxes the notification may go missing. Microsoft are in the process of resolving this problem.
  • Scheduling the transitions.
  • Providing information and guidance.
  • Providing an uninterrupted mail service. Users may be prompted to close and reopen their mail client, but actual mail delivery will be uninterrupted.
  • Actually migrating all the data, this is a big plus as it doesn’t require engaging an external provider to move from BPOS to Office365.
During the transition the customer is responsible for the following:
  • Providing the end user training and communications. An important point to note is that all the data that the organisation wants migrated should be already in SharePoint. This is again the responsibility of the customer.
  • Updates required on the client hardware or software.
  • Configuration of domain DNS settings for Outlook and Lync.
  • Optional – Deploying ADFS role, Exchange Server 2010 CAS Role
Steve’s advice was to run the pre-deployment readiness tool.
One other nugget of information I found useful: following a transition all the sites run in v3 UI mode. This means the UI is in 2007 mode, and there are opportunities here for implementations to bring in the latest v4 UI features.

Session Two

Session two was presented by Andy Hutchins from Avanade and covered a collection of things to consider before you start:
  • The business case is not always obvious.
    • Sometimes IT is not the organisation’s core business.
    • They don’t always have a corporate roadmap which is aligned with cloud strategies.
    • It can be hard to put a quantifiable value on the benefits.
  • Not everybody loves the cloud
    • IT are normally the buyers and they can have particular views about cloud implementations, worrying about internal skills and job security.
    • Lawyers will assess the service agreement with a fine-tooth comb.
  • Building the coalition
    • Get the IT group on board.
    • Get Information Security involved.
    • Get the Internal Communications team involved so they can help the launch.
  • Are the organisation’s service partners ready?
    • Can all the existing service providers work in this new world?
Once you have approval for an Office 365 implementation what to consider next?
  • Identity, Identity, Identity.
  • Find the right robust champion inside the organisation.
  • Remember all the things that normally happen on a project.
  • So what is different?
    • The tools used.
    • Customer responsibilities.

Session Three

Session three was presented by Ant Clay of 21Apps fame. Ant’s slides are below but here are some of the key points:
  • What is a Hybrid Organisation
    • A collection of Microsoft Research from 2010.
    • About the people, technology and workplace environment.
  • Why is it relevant?
    • Work is still based on a factory model mentality.
    • Knowledge work doesn’t follow any really well defined global process.
    • Economic pressures.
    • Cultural shifts and how the company gets business.
    • Physical constraints.
    • Innovations like establishing the best teams and attracting the best people.
  • Where does Office 365 fit in?
    • When the cloud service went offline, no-one internally was fighting fires; an internal cost saving.
    • Focus on your business not your servers, network infrastructure.
    • No-one really cares about the tin… well, outside the server guys.
    • Reducing training needs and costs.
    • Don’t forget about outside the head office.
    • Allows different companies to locate in the same physical location without the complexity of hardware.
    • Can be stood up and functioning very quickly.


I got a lot out of the sessions. It’s great to see such investment in Office 365, helping partners and ISVs get information about the offerings and how to leverage them.

Activity Feed Item Templates


The SharePoint Activity Feed rendering is controlled by templates. These templates are used within the ActivityTemplate class which is responsible for controlling the rendering.

The OOB templates are held within a resources file located at:

C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\Resources\osrvcore.resx

This contains the following templates:

Keyname: “ActivityFeed_ProfilePropertyChange_SV_Template”
Template: {Publisher} updated profile.&lt;br/&gt;{Name}: {Value}

Keyname: “ActivityFeed_ProfilePropertyChange_MV_Template”
Template: {Publisher} updated profile.&lt;br/&gt;{List}

Keyname: “ActivityFeed_Birthday_Reminder_SV_Template”
Template: {Publisher} is celebrating a birthday on {Name}.

Keyname: “ActivityFeed_Birthday_Today_SV_Template”
Template: {Publisher} is celebrating a birthday today.&lt;br/&gt;Wish {Publisher} a happy birthday!

Keyname: “ActivityFeed_WorkplaceAnniversary_Reminder_SV_Template”
Template: {Publisher} is celebrating a {Value} year workplace anniversary on {Name}.

Keyname: “ActivityFeed_WorkplaceAnniversary_Today_SV_Template”
Template: {Publisher} is celebrating a {Value} year workplace anniversary today.&lt;br/&gt;Wish {Publisher} a happy anniversary!

Keyname: “ActivityFeed_ColleagueAddition_SV_Template”
Template: {Publisher} added a new colleague.&lt;br/&gt;{Link}

Keyname: “ActivityFeed_ColleagueAddition_MV_Template”
Template: {Publisher} added {Size} new colleagues.&lt;br/&gt;{List}

Keyname: “ActivityFeed_TitleChange_SV_Template”
Template: {Publisher} has a new job title.&lt;br/&gt;{Value}

Keyname: “ActivityFeed_ManagerChange_SV_Template”
Template: {Publisher} has a new manager. &lt;br/&gt; {Link}

Keyname: “ActivityFeed_BlogUpdate_SV_Template”
Template: {Publisher} published a new blog post.&lt;br/&gt;{Link}

Keyname: “ActivityFeed_DLMembershipChange_SV_Template”
Template: {Publisher} has a new membership. &lt;br/&gt; {Link}

Keyname: “ActivityFeed_DLMembershipChange_MV_Template”
Template: {Publisher} has {Size} new memberships. &lt;br/&gt; {List}

Keyname: “ActivityFeed_SocialTaggingByColleague_SV_Template”
Template: {Publisher} tagged {Link} with {Link2}.

Keyname: “ActivityFeed_SocialTaggingByColleague_MV_Template”
Template: {Publisher} tagged {Link}.&lt;br/&gt;{List}

Keyname: “ActivityFeed_NoteboardPosts_SV_Template”
Template: {Publisher} posted a note on {Link}.&lt;br/&gt;{Value}

Keyname: “ActivityFeed_SocialTaggingByAnyone_SV_Template”
Template: {Publisher} tagged {Link} with your interest.&lt;br/&gt;{Link2}

Keyname: “ActivityFeed_SocialRatings_SV_Template”
Template: {Publisher} rated {Link} as {Value} of {Name}.

Keyname: “ActivityFeed_SharingInterest_SV_Template”
Template: {Publisher} shares an interest with you. &lt;br/&gt; {Value}

Keyname: “ActivityFeed_SharingInterest_MV_Template”
Template: {Publisher} shares {Size} interests with you. &lt;br/&gt; {List}

The file also contains the remaining templates along with the display names for each activity type:

Keyname: “ActivityFeed_ChangeMarker_SV_Template”
Template: Previous Gatherer Run

Keyname: “ActivityFeed_Status_Message_SV_Template”
Template: {Publisher} says “{Value}”.

Keyname: “ActivityFeed_ProfilePropertyChange_Type_Display”
Template: Profile update

Keyname: “ActivityFeed_Birthday_Reminder_Type_Display”
Template: Upcoming birthday

Keyname: “ActivityFeed_Birthday_Today_Type_Display”
Template: Birthday

Keyname: “ActivityFeed_WorkplaceAnniversary_Reminder_Type_Display”
Template: Upcoming workplace anniversary

Keyname: “ActivityFeed_WorkplaceAnniversary_Today_Type_Display”
Template: Workplace anniversary

Keyname: “ActivityFeed_ColleagueAddition_Type_Display”
Template: New colleague

Keyname: “ActivityFeed_TitleChange_Type_Display”
Template: Job title change

Keyname: “ActivityFeed_ManagerChange_Type_Display”
Template: Manager change

Keyname: “ActivityFeed_BlogUpdate_Type_Display”
Template: New blog post

Keyname: “ActivityFeed_DLMembershipChange_Type_Display”
Template: New membership

Keyname: “ActivityFeed_SocialTaggingByColleague_Type_Display”
Template: Tagging by my colleague

Keyname: “ActivityFeed_NoteboardPosts_Type_Display”
Template: Note Board post

Keyname: “ActivityFeed_SocialTaggingByAnyone_Type_Display”
Template: Tagging with my interests

Keyname: “ActivityFeed_SharingInterest_Type_Display”
Template: Sharing Interests

Keyname: “ActivityFeed_ChangeMarker_Type_Display”
Template: Gatherer Change Marker

Keyname: “ActivityFeed_SocialRatings_Type_Display”
Template: Rating

Keyname: “ActivityFeed_Status_Message_Type_Display”
Template: Status Message

The curly-bracket tokens are then replaced by the UI web part rendering code.

Hopefully this sheds a little bit more light on the internals of the activity feed.