Adding GitHub to the App Launcher


Let’s start with a little background

One of the latest features released to Office 365 and Azure is the ‘App Launcher’. This feature (Microsoft Announcement) provides a consistent menu of applications that the user can launch. Azure Active Directory now provides an easy way to integrate with many SaaS platforms. It provides identity and access management features through the Azure portal and the Access Panel, where users can discover the apps they have access to. The App Launcher leverages the same underpinnings within Azure to provide the suite-wide UX within Office 365.

Azure Access Panel

Information about setting up Application Access in Azure Active Directory can be found here:

Another feature we won’t go through, but which is worth mentioning, is the ‘Change Password’ feature on the profile tab.

This is a screenshot of my tenant’s Access Panel. You can browse to yours using:


The Access Panel can serve several different types of application:

  • Office 365 applications – If you are using Office 365 such as Exchange and SharePoint and the logged in user is assigned a license then these will appear. The user will be automatically signed in when they click any of the Office 365 apps.
  • Microsoft or Third Party apps configured with Federation based SSO – If an Azure admin has configured the app with single sign-on mode set to ‘Azure AD Single Sign-On’ then when a user clicks the app they will be automatically logged in assuming they have been explicitly granted access to that application.
  • Password based SSO without identity provisioning – These are applications the Azure admin has added with the single sign-on mode set to ‘Password based Single Sign-on’. It is important to realise that all users authenticated to the Azure AD will see these applications. The first time a user clicks one of these apps they will be asked to install a lightweight browser plugin for IE or Chrome. Once they restart the browser, the next time they navigate to that app they will be asked to enter the username and password combination for that app. This is then securely stored in Azure AD and linked to their organisation account. The next time the user clicks that app they will be automatically signed in with the credentials they provided. Updating credentials in the third party app requires the user to update their Azure AD stored credentials from the context menu on the app tile.
  • Password based SSO with identity provisioning – These are applications the Azure admin has added with the single sign-on mode set to ‘Password based Single Sign-on’ as well as identity provisioning. The first time a user clicks one of these apps they will be asked to install a lightweight browser plugin for IE or Chrome. Once they restart the browser the next time they will be automatically signed in to the application.
  • Application with existing SSO solutions – These applications are configured with the sign-on mode set to ‘Existing Single Sign-on’. This option supports existing SSO methods such as ADFS 2.0 or whatever the third party application uses.

Full details about the Access Panel can be found here:

App Launcher

The App Launcher is the name for the UX within the Office 365 suite. The screenshot below shows the fly-out menu active on my tenant. You can see all the apps this user is assigned licenses for, as well as the Admin tile, since this user is a tenant admin.


You’ll also see the ‘My Apps’ option in the bottom right corner. This takes you to a fully immersive experience listing all your apps, as you can see from the screenshot below.


This page lists all the applications from Azure AD as well as anything you have installed within your OneDrive for Business site on SharePoint Online.

Configuring GitHub through the App Launcher

So we’ve taken a whistle-stop tour around the Azure AD Access Panel and App Launcher; let’s now look at how to add an application to it. For this article we’re going to look at providing our users SSO for GitHub. The Azure AD links above show how to connect up to all sorts of services like Salesforce, Dropbox, etc., but Microsoft’s latest code repository choice isn’t listed. As all the Office Dev Code Samples these days live in GitHub, it makes sense to provide an SSO implementation for your dev teams. Here’s how.

The first thing to do is log into the Azure portal. You’ll see the connected Azure Active Directories listed; you might have several or just your Office 365 directory. Pick the one you want the application to show up in. In my example I’ll pick my main tenant.


When you click the required AD row it will switch into the dashboard for that AD service. As you can see by the screenshot below there are lots of different things you could do here, but we are going to focus on the ‘Applications’ tab only.


Clicking the ‘Applications’ tab shows the connected applications. In the screenshot you can see I’ve been busy with the Office 365 APIs. Also note that this AD is connected to my Office 365 subscriptions, so both Exchange and SharePoint are listed. These don’t have the same degree of settings available as other applications though.


So to add a new application, click ‘Add’ from the menu bar. This pops up a lightbox, as you can see below. There are two options: the first is to add a custom application which you are developing (a topic for a further article), the second is to connect a service from the gallery. At the time of writing there are about 4500 services and applications available in the gallery, so it’s worth having a peek through. GitHub is an existing service, so we need to click ‘Add an application from the gallery’.


Rather than browse, it will be easier to type ‘GitHub’ in the search box. You’ll see the results below. Click the ‘tick’ button to confirm.


Now GitHub is connected to your Azure AD as an application. We now need to configure the SSO settings and assign some users.


Click the ‘Configure single sign-on’ button to set up SSO for GitHub. The lightbox that pops up has two options: the first is Password Single Sign-on, the second is for existing Single Sign-on. Both are explained in more detail above. We are going to choose ‘Password Single Sign-on’, as we don’t already have anything else configured for SSO with GitHub. Click the ‘tick’ to confirm.


We have now configured our chosen method of SSO; it’s time to assign some users. Click the ‘Users’ tab. From here all the users in your AD are listed, so you’ll probably want to use the slightly hidden search feature on the far right of the table header to narrow the view down to the users you want.


Once you have found your desired user, select them by clicking the row, and then choose ‘Assign’ from the menu bar.


The lightbox that pops up allows us to confirm that the user is about to be assigned access via SSO to this application. We’ll come back to the checkbox later in the article; for now, leave it unchecked. Click the ‘tick’ to confirm.


So there we have it: in some fairly simple steps we have configured SSO with GitHub via our Azure Active Directory. Let’s now take a look at the implications for the end user experience in both the Access Panel and App Launcher.

Access Panel user experience

Now GitHub will show up for the assigned user. In the screenshot you can see the new GitHub tile has appeared. It can sometimes take a few minutes to update, and the page may display a refresh message when changes have happened that require a reload.


As mentioned earlier a user can maintain their stored credentials via the Access Panel. As you can see from the screen shot this option is available from the tile on the Access Panel.


Clicking the GitHub app for the very first time from the Access Panel invokes the browser plugin installer, as you can see from the screenshot below.


In this example I was using Chrome, so here are the pop ups which trigger the install.


Confirm the installation dialog.


Next time you click the GitHub App you will be asked to enter your credentials as Azure AD does not yet have any stored. Enter the desired credentials and click ‘Sign In’.


Now when you click it, Azure SSO will kick in via the browser extension and log you in with the stored credentials. Blink and you’ll miss it though; it took me five attempts to screen grab the login step.


And there you have it, signed in to GitHub with the SSO password.


App Launcher user experience

The Office 365 App Launcher MyApps page now sports the same GitHub icon under ‘My Apps’.


Clicking the GitHub app for the very first time from the My Apps page invokes the browser plugin installer, as you can see from the screenshot below.


The next time you click the GitHub App the same SSO process as above is invoked and you get signed in.

One feature of the App Launcher which the Access Panel can’t do is allow the user to pin the App to the flyout menu. To do this navigate to the ‘My Apps’ page and from the context menu of the app click ‘Pin to app launcher’ as you can see in the screen shot below.


As you can see this then pins that app to your App Launcher menu.


Other stuff worthy of a mention

App Launcher where a user has no App assignment

Below is a screen shot of a different user within the same tenant and Azure AD who doesn’t have GitHub assigned as an App. As you can see their ‘My Apps’ page doesn’t list it.


Assigning a credential on behalf of a user in the Azure Portal

We mentioned the checkbox earlier. If you want to set the username and password during assignment, check the checkbox and you get the option to enter the credentials on behalf of the user.


So why is this important? Well, consider situations where you don’t want a user knowing or setting the credentials. For example, a situation where the organisation has a marketing Twitter account. You can now provide SSO for the marketing team by setting up the credentials on their behalf. They can obviously still change them in Twitter, but it removes the need to email everyone the password.

Removing a user app assignment

Removing the user assignment is as easy as selecting them and clicking ‘Remove’ from the menu bar.


App dashboard

Another thing worth mentioning is the App dashboard. Here you can see the login activity and some basic information about the app. What is really useful though is the Single Sign-on url. This is a unique url for this SSO’d app; pasting it in effectively skips the Access Panel or App Launcher steps and navigates directly through the sign-on process. This would be useful if you are considering emails or Yammer posts with links directly to the application.



Hopefully you’ve found this useful and seen how easy it is to take advantage of the SSO features to improve your user experience.

So we now have GitHub easily available to all the assigned users, probably starting with the dev team.

SP Connect 2014 Presentation


Tuesday 18th and Wednesday 19th saw this year’s SP Connect 2014 take place at the Meervaart Theatre, Amsterdam. It was a great, well-organised event with so many quality speakers and companies in attendance. It was a privilege to be invited to speak.

I presented a session on the new Office 365 APIs with Chris O’Brien; the slides from our session can be seen below. I hope everyone found the session useful. I certainly enjoyed presenting to such an interactive audience.

Our session demonstrated the latest release of the Office 365 APIs, which recently GA’d. We used the samples available on GitHub; the Web Client library demo used the MVC5 starter example. This is a single-tenancy app which uses the three elements of the Outlook Client library and the Auth models. It’s a great place to start as it shows a good spread of the API. The second demo showed the preview File Handler, which shows how to extend Office 365 with a file extension capability. The pictures below give a sneak peek at how it looks when it’s working. The sample can be found here




Thanks to everyone who attended the session; hopefully I’ll be back at next year’s event. Special thanks to Daniel Laskewitz for allowing me to use his session picture in this article.

Document conversations does not equal hover panel post


Back in June 2014 Microsoft/Yammer announced the arrival of a new feature called ‘Document Conversations’. Available at the time of writing in some tenants (full roll out is in progress), this feature adds a fly-out panel to Document Libraries in Office 365. The full details of the feature can be seen on the Office Blog here:

Our Office 365 tenant is set up with ‘First Release’, which provides upcoming updates about two weeks prior to their formal rollout. Over the last month we’ve seen the Document Conversations feature coming and going. Hopefully it will be fully completed at some point very soon. During this rollout I took the time to try out the feature and get a feel for how it works.

Document Conversations in action on OneDrive for Business

Browsing to my OneDrive for Business page nothing different appears on the display. As you can see below it still looks the same as when the ‘Site Folders’ rolled out earlier this year.


So to invoke the ‘Document Conversations’ window you have to actually browse to the document. As you can see below, a new right-hand sliver has appeared with the Yammer logo and an indicator to click to expand.


Clicking the Document Conversations bar expands it into the right-hand pane, much like the Apps for Office do in Office 365 Pro. On first use, or when you’ve not signed into Yammer, you will be prompted to sign in. The screen grab below was taken after signing in. The first thing to note here is that it isn’t displaying any threads; that’s simply because I had newly created this document for the purpose of this article.


Next let’s create a new ‘Yam’ from the ‘Document Conversation’ pane. At this time it allows the user to select a group to post the ‘Yam’ into; interestingly there is no way to set up a default for this. I think it would be awesome if Microsoft had provided a ‘default group’ setting on the hosting Document Library settings, as I suspect the feed is using the Yammer Embed, which has the ability to set a default group. That way users could configure their defaults and avoid everyone posting into the ‘All Company’ group.


After posting the ‘Yam’ it can be seen in the ‘Document Conversation’ pane. Note how it’s being presented as an OpenGraph object and not a url.


Below is an example of a reply to the conversation thread.


This is the same conversation thread within Yammer.


Posting from SharePoint to Yammer

Before the ‘Document Conversation’ feature was designed and built one of the first Yammer integrations with SharePoint was the ‘Post’ option which appeared on the document hover panels. The screenshot below shows the ‘Post’ option on the same file we just used for the ‘Document Conversation’ demo.


Clicking ‘Post’ launches a modal window with a url in the message body so you can type a ‘Yam’. As you can see from the screenshot this is a pretty basic UI.


The screenshot below is the ‘Post’ in the yammer group.


Comparing the two approaches

So we’ve seen how both approaches work. The one thing that I found puzzling was that in the Yammer group I’d seen two threads about the same document: one from the ‘Document Conversation’ pane and one from the ‘Post’ modal dialog. This didn’t make sense; at first glance I would have expected both to reference the same item (url) as the OpenGraph object.

Document Conversation

The ‘Document conversation’ thread has the following JSON returned from the API.


The body of the ‘Yam’ contained a url of the file path with query string ?web=1 which when launched opens the document in the Office Online app.


The OpenGraph object is detailed below. Again note the url has the ?web=1 querystring.
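As a small aside, the effect of that query string is easy to sketch in a few lines of Python. This is illustration only, with a made-up tenant and file path, showing how the web=1 flag (which asks SharePoint to open the file in Office Online) would be appended to a document link:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_web_view_flag(doc_url: str) -> str:
    # Append web=1, which tells SharePoint to open the file in the
    # Office Online app instead of downloading it.
    parts = urlsplit(doc_url)
    query = dict(parse_qsl(parts.query))
    query["web"] = "1"
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

# Hypothetical tenant and file path, for illustration only.
print(add_web_view_flag(
    "https://contoso.sharepoint.com/Shared%20Documents/Spec.docx"))
```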


Post from hover panel

The ‘Post’ thread has the following JSON returned from the API. We can see nothing special in the thread itself.


The content returned by the ‘Yam’ this time shows the WOPI (Office Online) url has been used.


The view of the attachments xml confirms the information.


So doing it again

Posting via ‘Post’ again creates a brand new post. The ‘Post’ feature adds a brand new OpenGraph object and thus starts a new thread, rather than finding the existing thread and presenting it back to the user in the popup window.



So the two methods use different urls and thus produce two different OpenGraph objects.

It would be great if Microsoft could bring them into alignment so that there is one solution url so all conversations appear in the same thread.
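To sketch why the url matters so much here: Yammer’s REST API ties each post to an OpenGraph object via the og_url parameter on the messages.json endpoint. The payloads below are illustrative only (the tenant and file urls are made up, and no request is sent), but they show how the two url forms for the same document become two distinct objects, and therefore two threads:

```python
# Illustrative only: builds the form payload for Yammer's
# messages.json endpoint; no HTTP request is actually made.
def build_yam_payload(body: str, og_url: str) -> dict:
    return {"body": body, "og_url": og_url}

# The same document, referenced two different ways (made-up tenant).
direct = build_yam_payload(
    "Great spec, take a look",
    "https://contoso.sharepoint.com/Shared%20Documents/Spec.docx?web=1")
wopi = build_yam_payload(
    "Great spec, take a look",
    "https://contoso.sharepoint.com/_layouts/15/WopiFrame.aspx"
    "?sourcedoc=%2FShared%20Documents%2FSpec%2Edocx")

# Different og_url values mean two distinct OpenGraph objects,
# and therefore two separate conversation threads.
print(direct["og_url"] == wopi["og_url"])  # False
```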

Delve YamJam summary


This week, people who had their Office 365 tenants set up with ‘First Release’ started to see the long anticipated Delve (formerly codenamed Oslo) arriving on their tenants.

Microsoft organised a YamJam for Delve in the Office365 Technical Yammer network here:

This article is a summary of the information which is correct at the time of writing.

Is Delve coming to on-prem?

A hybrid approach is more appropriate due to the complexity and processing power required to drive the OfficeGraph engine. There will be APIs to allow connection to other data sources for the signals driving the OfficeGraph.

Microsoft are planning a hybrid connector that can integrate signals and content from on-premises. They have no current timeline. This will probably feature in scenarios where Lync and/or Exchange have an on-premises installation.

Privacy concerns

Some users raised concerns around privacy. The example cited was a company where Delve was showing certain HR documents as trending: for example, psychological assistance, domestic partner coverage and maternity benefits. The question was around being able to exclude certain content from producing signals.

The documents could be excluded through the normal SharePoint permissions capabilities. Delve relies on the search index, so excluding a file or folder will exclude it from Delve as well.

Currently there is no feature to exclude documents from Delve but have them available to everyone via SharePoint/Search.

A side note was raised about storing HR documents in Yammer and the fact that ‘viewing’ one shows up in the activity feed in the top right. This gives people visibility of what other users are looking at, so someone looking at HR docs around maternity is effectively announcing that interest to the whole organisation. Not so good.

Delve never shows ‘who’ viewed a document. Trending is invoked when multiple people who have a relationship with you have accessed the doc. The author is the named entity, which is slightly confusing in the UI: at a glance the name appears to be the user who viewed the document. Careful communication would be needed on rollout.

Delve only shows if someone modifies a document (this is available through SharePoint anyway). Where many people have viewed a document, Delve says several of your colleagues have viewed it, but never divulges the names.

‘Trending’ does not mean a person viewed it, only that your colleagues are generating activity around it. (no information on the definition of activity).

CSOM / JSOM API availability

OfficeGraph will have an API. Current information is available here:
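For a feel of what consuming the OfficeGraph looks like, here is a hedged sketch: Graph Query Language (GQL) rides on the normal SharePoint Online search REST call as a GraphQuery property. The tenant name and action code below are assumptions for illustration only; check the OfficeGraph documentation for the real action codes.

```python
from urllib.parse import quote

def office_graph_query_url(tenant: str, actor: str = "ME",
                           action: int = 1021) -> str:
    # GQL is passed alongside the normal query text as a GraphQuery
    # property. The action code (1021 here) is illustrative, not
    # authoritative -- see the OfficeGraph documentation.
    gql = f"GraphQuery:ACTOR({actor}\\, action\\:{action})"
    return (f"https://{tenant}.sharepoint.com/_api/search/query"
            f"?Querytext='*'&Properties='{quote(gql)}'")

print(office_graph_query_url("contoso"))
```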

Could OfficeGraph be consumed in PowerBI?

In theory this should be possible as it’s got an API.

Restricting the rollout to specific users

Like an ESN, Delve thrives on a wide and deep network of users. By restricting it to a subset, an organisation would fall into the ‘doomed social pilot’ trap of not having enough signals to add real value. Obviously this is an ESN success perspective; organisations will have reasons for this request: regulation, change management and security were all cited.

It was also noted that you can disable Delve at tenant level; it was unclear whether this covers the Delve UI alone or includes the OfficeGraph underpinnings.

When will I get it?

Currently this is being rolled out to ‘First Release’ tenants first.

What is the Delve UI item display limit?

Answer: 36 documents before adding filters by using search. Microsoft said 36 was chosen as the starting point based on internal MS trial data. Their data showed that click rates dropped to zero at a certain point in the page.

Microsoft’s choice of name Delve

Mixed feelings: those who aren’t English speakers said that Delve doesn’t always have a real meaning in some languages. Others just preferred Oslo and thought Delve didn’t really jump out. As with most questions like this, nothing really bad comes of the name. Let’s just hope it doesn’t get a rebrand in 6 months 😉

How will Delve handle existing content and groups?

Being search based it can pick up everything in the tenant today.

Which plans get Delve?

Office365 E1-E4 and corresponding Gov and Academic plans.

At first release Delve gets signals from Exchange, OneDrive for Business, SharePoint Online and Yammer. Primary content surfaced from OneDrive for Business and SharePoint Online team sites.

What determines the people list?

This is the top five people you interact with.

Useful links

Delve documentation:

Delve for Office365 admins here:

OfficeGraph API documentation

An introduction to PowerBI


PowerBI is the cloud version of the Microsoft BI stack. After seeing some awesome things at SPC14 I decided to take a dive into the course material on MVA.

Notes from watching the MVA course on PowerBI

(MVA training course)

Power Query

(MVA session)

Power Query represents the ‘getting data’ part of the stack. It’s available as an Excel 2013 plugin (download from Microsoft). Over time this will become the way to get data into Excel, replacing the ‘Data’ and Power Pivot ‘Data’ ribbon features.

It has a huge number of available data source connectors:

  • From web which can pull in web based data sources
  • From file which can pull in from files like csv
  • From database which can pull from a wide variety of database systems
  • From other sources, basically a collection not matching the above categories such as SharePoint lists, the Azure services, Facebook

This list will continue to grow as the Microsoft team build more and more extensions.

There is a great feature for finding data. The ‘Online Search’ can find published data and queries from across the web (Microsoft maintain a huge collection in their catalogue), from your internal organisation’s catalogue and from shared queries.

Once a query is configured, Power Query does as much as it can to pass the right query to the data source. For example, if you’re pulling relational data from a DBMS, Power Query passes the restrictive query down to that DBMS to push the query load onto that system. This helps to ensure performance is optimal. Where a user is using flat files this can’t be offloaded, so the local machine bears the brunt of the performance hit. This is something to consider when designing solutions. I personally think this intelligence is amazing and could really play well when designing a self-service BI solution.
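The idea can be sketched in a few lines of Python, with SQLite standing in for the corporate DBMS. The ‘folded’ query lets the database do the filtering and aggregation, while the ‘unfolded’ path pulls every row back and does the work locally, which is effectively what happens with flat-file sources:

```python
import sqlite3

# Illustration of pushing the restrictive query down to the database
# engine instead of pulling every row and filtering on the client.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INT)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100), ("APAC", 250), ("EMEA", 75)])

# Folded: the DBMS does the filtering and aggregation.
folded = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'").fetchone()[0]

# Unfolded: everything comes across the wire and the client does the
# work (this is effectively the flat-file situation).
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
local = sum(amount for region, amount in rows if region == "EMEA")

print(folded, local)  # 175 175
```

Either way the answer is the same; the difference is which machine bears the cost.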

The query can pull the data into either the local Worksheet or the Data Model. Pulling into the Worksheet is best employed where the dataset is at the smaller end of the spectrum; larger data collections benefit from being pulled into the Data Model. Also consider your file size: bringing the data into the model drastically reduces its footprint. They gave an example of a 4 GB file being compressed into about a 160 MB file. This is important when you consider the maximum file size in PowerBI is 250 MB.

Another neat feature is pulling data from a ‘folder’ into one collection, so you can bring together multiple csv files, for example, in a straightforward fashion.
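The ‘From Folder’ idea is easy to picture with a small Python sketch (the file names and columns are made up for the demo):

```python
import csv
import glob
import os
import tempfile

# Sketch of a "From Folder" load: combine every CSV in a folder
# into one collection of rows.
def load_folder(folder: str) -> list:
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        with open(path, newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows

# Demo with two small files written to a temporary folder.
folder = tempfile.mkdtemp()
for name, body in [("jan.csv", "month,total\nJan,10\n"),
                   ("feb.csv", "month,total\nFeb,20\n")]:
    with open(os.path.join(folder, name), "w") as f:
        f.write(body)

combined = load_folder(folder)
print(len(combined))  # 2
```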

The query can actually parse formats like JSON, which opens up possibilities to call data services and simply transform the results into tables and related tables.
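To make that concrete, here’s a minimal sketch (with a made-up payload) of flattening a nested JSON response into table-style rows, the way Power Query expands records into columns:

```python
import json

# A nested JSON payload from a hypothetical data service.
payload = json.loads("""
{"orders": [
  {"id": 1, "customer": {"name": "Ann", "city": "Leeds"}, "total": 42},
  {"id": 2, "customer": {"name": "Bob", "city": "York"}, "total": 17}
]}
""")

# Expand each nested record into flat columns.
rows = [{"id": o["id"],
         "customer_name": o["customer"]["name"],
         "customer_city": o["customer"]["city"],
         "total": o["total"]}
        for o in payload["orders"]]

print(rows[0]["customer_name"])  # Ann
```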

Power Query Editor

Once you create a new Power Query it launches the editor, where you can perform loads of neat things on the data:

  • Manipulate the columns, like ordering, name, format, using them as headers
  • Splitting columns by delimiters etc.
  • Filtering
  • Removing duplicates
  • Merging additional queries into the same sheet, thus merging datasets
    • Bringing in just columns you want
    • Creating your own dimensions
    • Choosing which columns you merge on
  • Unpivoting data, as you often get data in a pivoted format as it is generally how it gets presented for reporting
  • Formula bar, which allows you to use the query language to create additional data interactions
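The unpivot step in the list above is worth a quick sketch; turning reporting-style columns (one per month) back into tidy attribute/value rows looks something like this, with made-up sample data:

```python
# Sketch of the "unpivot" step: turn reporting-style columns
# (one column per month) back into tidy attribute/value rows.
def unpivot(rows, id_col, value_cols):
    out = []
    for row in rows:
        for col in value_cols:
            out.append({id_col: row[id_col],
                        "attribute": col,
                        "value": row[col]})
    return out

pivoted = [{"product": "Tea", "Jan": 10, "Feb": 12},
           {"product": "Coffee", "Jan": 30, "Feb": 28}]

tidy = unpivot(pivoted, "product", ["Jan", "Feb"])
print(len(tidy))  # 4
```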

Data stewardship

(MVA session)

This session examines the problem space: what are the things the business is trying to solve?

Information challenges:

  • Searching for data takes time, there are lots of useful datasets within an organisation which are often not available to others
  • Preparing it for use takes time, it is not always clean or in a sensible format for reuse, lots of redundant processing
  • IT spends time trying to service requests, due to the platforms and processes most users have to get IT to do the heavy lifting to create views, reports etc., they also often can’t react in a timeframe the business needs
  • IT has to also govern access and data use
  • Lack of trust is significant within the business user community; because the stewardship of the data is often unknown, most users question the validity or authenticity of the data

In some organisations the role of ‘Data Steward’ exists. These people tend to have a foot in both the end user and IT camps, and tend to understand both what IT has provisioned and what the business needs.

Quote from Gartner “… only business users close to the content can evaluate information in its business context”

Data stewards can promote queries into the corporate data catalogue and help the business users understand them.

(note to self – using Yammer groups to help here would be worth looking at)

It’s an interesting viewpoint. Consider a scenario where a dataset is held in an Excel workbook and that gets emailed to another person. At this stage you have forked the data, and it has instantly become less trustworthy and accurate. Now imagine if instead you’d shared the query: the data remains a single source and is thus more trustworthy. In business, it’s often in the earlier stages of report creation that datasets which could usefully be shared are created. By looking to share the queries rather than the end-product reporting, we enable more uses of the data building blocks.

The common steps an information worker is taking:

  • Identifying the need, what is the problem they are trying to solve
  • Identifying the data, what data could help them solve the problem, which domains are they from, who owns them

The ‘Data Steward’

(taken from the MVA slides)


Data Catalogue

This is the way we can promote the data:

  • Stores and processes the metadata about the data sources available, users and their relationships
  • Provides the search functionality
  • Connects to the corporate data through the integration layer

It is a set of Office365 services.

Important note: sharing to the catalogue only shares the metadata of the query, not the queried data itself. The important thing to realise is that the user who executes the query still needs access to the underlying data.

Power Query

For Information workers:

  • Search for and access relevant data
  • Filter and merge data

For Data stewards:

  • Define repeatable data queries
  • Publish, share and annotate those queries

Data Management Portal

  • Manage metadata shared queries and published data sources
  • Monitor telemetry: usage, search, etc.
  • Gain insight on data lineage

Admin Centre

  • Govern the data integration layer, admin the on-prem access

Data Publishing

Diagram taken from MVA session


Shows the multiple ways to publish your data to the data catalogue.

Every Office 365 PowerBI tenant gets its own ‘Corporate Data Catalogue’.

The admin centre in Office365 looks something like this


By default when a user searches they search across both Corporate and the Microsoft public catalogues.

Some of the old drawbacks of documenting data services were that finding the endpoint service and reading about it were often disconnected. This led people to question the investment in the documentation, as it was often missed. With PowerBI this description sits right next to the result in Power Query, making the metadata instantly accessible and therefore valuable.

To get all these capabilities you need to be ‘signed in’ to Power Query via your tenant. It sounds obvious, but it’s very important to factor into the messaging during rollout.

Share Query dialog

So as the Data Steward you can choose to ‘share’ your query from within Power Query once you’re happy with the query configuration.

The example from the session looked like this.


So the shared query has the following:

  • Name which is the visible name for that query, here you should be descriptive to help people quickly scan a list for it
  • Description, helping further explain the data query; one point to think about is what sort of copy style your organisation needs, as it shows up anywhere the query appears
  • Data sources, this lists those data sources being used inside this query, this will be important when you consider the data lineage
  • Sharing settings, if the user sharing is part of the ‘Data Steward’ role then they can certify the data as officially supported/verified as truth, ‘Share with’ is pretty obvious
  • Documentation url, the url to anything which is classed as the documentation, in larger organisations this could be the direct document office online url
  • Upload preview rows, does exactly what it says on the tin

Note that the data sources within the organisation can themselves be annotated to provide friendly information to the user. In the dialog above the ‘view portal’ link goes to the PowerBI admin screen which allows further description to be created for that data source. Notice the contact information to help a user to gain access to this resource.


Some insights from the analytics would allow you to work out the most highly utilised queries and data sources, as well as those that show up a lot in search but are rarely used. All this starts to help drive value and returns from the exploration and sharing of corporate data.
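Those two signals can be sketched from raw telemetry in a few lines; the event records below are entirely made up for illustration:

```python
from collections import Counter

# Hypothetical catalogue telemetry: search hits and query executions.
events = [
    {"query": "EMEA Sales", "kind": "search"},
    {"query": "EMEA Sales", "kind": "execute"},
    {"query": "HR Headcount", "kind": "search"},
    {"query": "HR Headcount", "kind": "search"},
    {"query": "EMEA Sales", "kind": "execute"},
]

searches = Counter(e["query"] for e in events if e["kind"] == "search")
executes = Counter(e["query"] for e in events if e["kind"] == "execute")

# Highest-utilised query, and queries found in search but never run.
most_used = executes.most_common(1)[0][0]
searched_not_used = [q for q in searches if executes[q] == 0]

print(most_used, searched_not_used)  # EMEA Sales ['HR Headcount']
```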

Power View

(MVA Session)

Power View is the ‘visualise’ element of PowerBI. It is the interactive data exploration, visualisation and presentation capability, and it is based in Excel 2013.

The session scenario:


A retail establishment whose sales are collected through the POS system and contained in the traditional hierarchy of Category – Sub Category – Item.

The role in the scenario is the ‘Bar Owner’, who has little technical experience but wants to analyse his POS data to create some targeted promotions.

The demo brought data into a Power Pivot model as below:


The basic first step is to go into your sheet and, from the ‘Insert’ ribbon, choose ‘Power View’. Power View brings in all the available fields and tables from the data model.

Three ways to add fields to the canvas:

  • Check the check box in the fields list in the tool pane
  • Drag it to the canvas
  • Drag it to the ‘Fields’ list in the tool pane

Once you have selected the fields you can change the representation from the ribbon options. Manipulating the display is as easy as selecting the ‘style’ you desire and from the tool pane you can manipulate the fields and filters.

A very neat feature is that when you interact with one of the graphs, it highlights the same data pivots in the other graphs in the sheet. This ‘cross highlighting’ works across all the graphs in the view; even legends are interactive.

Removing the title from individual graphs is done by selecting the graph and from the ‘Layout’ ribbon selecting ‘None’ from the ‘Title’ button.

By the end of the report creation it looked like this


If the default colours and design are not to your liking you can change the theme and background from the ribbon.

To add filters to the report, simply drag them from the tool pane list onto the ‘Filters’ section of the report.

Due to the interactivity of the report, a user can really drill down and use the canvas elements to gain insights into their data.

Demo included adding a ‘KPI’ from the PowerPivot ribbon.


Pick the column to base the KPI from and then configure the values and display as you require. Once added it will update the PowerPivot data model and this new field appears in the Power View tool pane for selection (it has a traffic light icon).

Adding an in-canvas filter is as simple as dragging the field from the tool pane onto the canvas. It will then add that single column onto the canvas, but we can go a step further and turn it into a ‘Slicer’ from the ribbon. Now when a user selects a value from that column it cross-highlights the rest of the canvas.

The demo then showed how to roll up the view using ‘Matrix’ for the main table and enabling ‘Show levels’ to roll up the data further. Modifying which columns get displayed is again done through the tool pane options.

In the tool pane, when you drag a single field to the ‘Tile By’ option it creates tabs, each providing a design surface within the tab. You then set up the graphs within this surface.

Power Map

(MVA Session)

Power Map provides the mapping visualisations for your data.

Different steps to go through to create a Power Map visualisation:

  • Map data
    • Data in Excel, Power Pivot data model
    • Geo-code with Bing
    • 3D and 4D visuals
  • Discover insights
    • Play over time frames
    • Annotate anything interesting
    • Capture a timeframe as a scene
  • Share
    • Add effects
    • Interactive touring
    • Share workbooks

From the ribbon you can choose the ‘Map’ option.


Building a new tour

A tour allows you to present the information and guide the viewer across it.


The left-hand side shows the scenes, the centre the scene content, and the right-hand side the data.

An interesting note: the ‘other’ column basically allows you to map anything from your data that the Bing mapping API could geocode. The example given was airport codes.


You can add more than just text annotations; the example here is a picture.


During a tour you can still interact with the map, pause and drill down into the display.


PowerBI and mobile BI

(MVA Session)

You can add the PowerBI app from the app catalogue.

As a note, you can add the default site samples into your PowerBI site to help you get an understanding of some of the features.

To have PowerBI features light up for a specific Excel document it needs to be ‘Enabled’ in the options (part of the tile).

From the PowerBI dashboard, clicking the file opens it within Office Online.

‘Featured Reports’ is an interesting feature as it allows you to promote a report to all the users of your PowerBI capability. This is as simple as clicking ‘Add to featured reports’ from the tile context menu. There are no specified limits to the number of reports that can be set as ‘featured’.

If you don’t want to share a report with everyone but still want a shortcut to it, you can select ‘Favourite’ from the tile context menu to effectively pin that report as a favourite. These are then found under ‘My Power BI’ in the suite bar menu.

You can also schedule a data refresh from the tile context menu. This will call the data source and invoke a data refresh into the report.

Mobile PowerBI

You can install the Windows 8 app from the store. (Interesting that it’s called mobile when in fact it’s the Windows OS 😉)

By default it contains the Microsoft sample data. As seen in the screen grab below.


You can then add your own reports by ‘Browse’ from the context bar, select your site and pick up the Excel file containing your report. Tag it as favourite and it’ll show up on the main dashboard. Clicking into the report will then load the same Power View report as you would see in the browser.


You can also take advantage of the normal ‘Share’ capability from Windows 8. Here I’m choosing to share a report via email.


The thumbnails for the reports are also dynamically updated at intervals to reflect the actual view of the report. So even this adds a ‘peek’ style to the dashboard.

Natural Language Querying with Q&A

(MVA Session)

Q&A is natural language querying. Its primary focus is data exploration.

Key problems Q&A is aiming to assist organisations with:

  • How do I find the right data to answer the question I have at this point in time? There could be numerous data sources and how would you know which one to look at?
  • How do I find the right answer from within the huge collection of reports? Maybe the reports just don’t give the right slice of data or present the information in a form which makes sense.

So Q&A was born to address both of these challenges.

The demo scenario is covering a typical customer data model.


From within the PowerBI site, to enable an Excel workbook for Q&A you choose ‘Add to Q&A’ from the tile context menu.


This is an important step to remember as Q&A only searches within ‘enabled’ workbooks. Another thing to note is that SharePoint permissions still dictate what a user can access. For example if a workbook is only shared with certain people, then only those people can Q&A against it even though it is enabled.

To use Q&A you click the option in your PowerBI site and you’ll see something like this below.


Notice that the UX is very ‘search’-like; this was done consciously by Microsoft to entice users to use Q&A like they would a search engine.

So after the user starts to type a question, Q&A begins to work its magic. Immediately it narrows down which data models it might be using. Notice the blue text below the query box: this is the ‘Restatement’, showing how Q&A is transforming your typed question and querying the data models. Below that are query suggestions for other things Q&A thinks you might be trying to perform.


A really neat feature is that Q&A best-guesses the visual representation of the returned query, providing the user with what it considers the best display for the results. On top of this it can also apply filters; as you can see I’ve added an ordering to the query.


As you can see the filter pane is also available within the canvas for further filtering and settings.

The query box will also help the user because it greys out words which it can’t map to the data model.

Building models for Q&A

Sometimes there are things that just need some extra work to help Q&A assist the user, such as the naming conventions in the data model versus the terms the average business user actually uses. You can use ‘Synonyms’ in the data model to teach Q&A the other terms.

The formatting and modelling you use to drive Power View also supports Q&A in providing a better experience.

  • Use the correct data type for columns to help Q&A understand how to make the query to the model.
  • Set up the ‘Default Field Set’ from the ‘Advanced’ ribbon. These settings determine which fields are brought onto the canvas in Power View and they also allow Q&A to display those fields when a user just types the name of the table.
  • Set up the ‘Table Behaviour’ from the ‘Advanced’ ribbon. Setting the ‘Default Label’ allows Q&A to use the correct axis.
  • Set up ‘Synonyms’ within the data model from the ‘Advanced’ ribbon.

Note: At the time of writing this feature only comes with the version of Office installed from your Office 365 tenant software subscription.

Synonyms are displayed within a tool pane against your model. Within each field you can add, as a comma-delimited list, other words which might be used when looking for that column. The first one listed is the ‘primary noun’ for that entity; this is what Q&A will use in the restatement (the blue text under the user-entered query).


Sharing with Q&A

The first option is to simply copy the URL: like most search experiences, once you enter a query the URL reflects it, so you can distribute the URL to others to effectively reuse the query. If you think about this more widely, imagine you are asked a question by a business user: you can now find the data via Q&A, help them jump in, and let them explore further from that point.
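For example, building such a shareable link is just URL encoding. This is a purely hypothetical sketch: the real Q&A URL format isn’t documented here, so the `q` parameter name is an assumption made only to illustrate the idea.

```javascript
// Hypothetical sketch of a shareable Q&A link: append the typed question
// as an encoded query string parameter ('q' is an assumed parameter name,
// not the documented Q&A format).
function buildShareableQuestionUrl(qnaSiteUrl, question) {
    return qnaSiteUrl + "?q=" + encodeURIComponent(question);
}
```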

The second option is to add the question to the ‘Featured Questions’ in the PowerBI site.


This allows you to set up various settings for the question before adding it, such as the question itself, whether to show it on the home screen, size and colour and finally an image. The image below is the home screen with the question added.


This image is of the Q&A screen with the question listed.


Behind the scenes in Q&A

The overall steps are:

Search for interpretations of question – looking across all the workbooks enabled for Q&A and finding which ones could answer the question.


Score and select the best interpretation – ranking the workbooks answering capability.


This ‘Ranking’ means that Q&A will be using the top option.


Select best visual – determine which visual can explain the answer. This runs through a rules engine.


Influencing the best answer

Search for interpretations of question > Use data modelling and synonyms

Score and select best interpretation > workbooks in the site

Select best visual > data modelling (data types, data categories)

Data Management Gateway

(MVA Session)

The Data Management Gateway has three main capabilities:

  • How can I enable corporate data feeds over OData?
  • How can I enable discovery in Power Query, enabling people to find data?
  • How can I refresh Excel workbook data using SharePoint Online?

The conceptual layout for the Data Management Gateway is detailed in the diagram.


The Data Management Gateway sits in two places, one on-prem through a client and one online through PowerBI (PowerBI admin centre).

How does it work?

OData feeds through the Data Management Gateway


  1. Power Query requests data from the OData feed
  2. Data Management Gateway connects to the data source
  3. Results are returned to the Data Management Gateway
  4. The Data Management Gateway returns the data to Power Query
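The returned feed can also be consumed by any OData-aware client. As a hedged sketch (the verbose-JSON payload shape below is an assumption for illustration, not a specific gateway response), unwrapping the `d.results` envelope in JavaScript looks like this:

```javascript
// Illustrative helper, not part of any SharePoint/PowerBI API: unwraps a
// verbose-JSON OData response so client code can work with a plain array.
function readODataResults(responseText) {
    var payload = JSON.parse(responseText);
    // Verbose OData wraps collections in d.results; single entities sit
    // directly under d.
    if (payload.d && payload.d.results) {
        return payload.d.results;
    }
    return payload.d || [];
}
```

Called with the response text from step 4, it returns the row objects ready for binding or further shaping.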

Data refresh


  1. Excel workbook is loaded into SharePoint
  2. Data refresh is called
  3. Connects to the Gateway Cloud service
  4. The Gateway Cloud service checks the user’s authorisation to perform a refresh
  5. If authorised, it sends the command to the on-prem Data Management Gateway
  6. Data Management Gateway sends the command to the data source
  7. The data source returns the results to the Data Management Gateway
  8. The results are transferred up to the Gateway Cloud service
  9. Returns the data to the Excel workbook

PowerBI Admin Centre

The admin centre provides the following:

  • Allows you to install and monitor the Data Management Gateways for the organisation
  • Configures access to the cloud enabled data sources
  • Exposes OData feeds to the corporate data sources
  • Configures the PowerBI user roles

You can access the Admin centre from either your tenant admin portal or directly via


An interesting note: the data source setup can be configured with either a Windows or a Database account. BUT you still need to add users to the data source for them to actually access it.

SP.RequestExecutor cross domain calls using REST gotcha


Recently I was building a prototype SharePoint hosted app to add items into the Host web. The basic operation of the app is that it queries the Host web for specific list types, then allows a user to add a collection of new items to the selected list. So read and write operations.

When dealing with the Host web it is important to remember that you are subject to ‘cross domain’ calls and the restrictions in place for them. The browser protects users from cross-site scripting by specifying that client code can only access information within the same URL domain.

Thankfully SharePoint comes with some inbuilt options for these calls. The Cross Domain library is the primary option in either JSOM or REST forms.

I’ve been leaning towards REST at the moment, primarily as a learning focus so I could get used to this method of data interaction.

So the first code sample is to get the host web task lists:

NB: This is a cut down extract of the function just to highlight the core request.

var executor;

// Initialize the RequestExecutor with the app web URL.
executor = new SP.RequestExecutor(appWebUrl);

// Get all the available task lists from the host web.
executor.executeAsync({
    url: appWebUrl +
        "/_api/SP.AppContextSite(@target)/web/lists/?$filter=BaseTemplate eq 171&$select=ID,Title,ImageUrl,ItemCount,ListItemEntityTypeFullName&@target='" + hostWebUrl + "'",
    method: "GET",
    headers: { "Accept": "application/json; odata=verbose" },
    success: successHandler,
    error: errorHandler
});

Note from this sample how the host and app web URLs are combined with SP.AppContextSite within the request URL. This is the key to invoking a cross-domain call in REST using the SP.RequestExecutor.
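That URL shape is easy to get wrong, so the pattern can be pulled into a small helper. This is my own illustrative function, not part of the SharePoint API; the path and query string you pass in are up to you.

```javascript
// Builds an app-web REST URL that targets the host web via
// SP.AppContextSite. Purely a convenience wrapper around string
// concatenation; the @target parameter must always reference the host web.
function buildCrossDomainUrl(appWebUrl, hostWebUrl, path, query) {
    var url = appWebUrl + "/_api/SP.AppContextSite(@target)/" + path;
    var target = "@target='" + hostWebUrl + "'";
    return url + (query ? "?" + query + "&" : "?") + target;
}
```

With this helper, the task-list request above becomes `buildCrossDomainUrl(appWebUrl, hostWebUrl, "web/lists/", "$filter=BaseTemplate eq 171")` plus the `$select` clause.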

The second snippet of code is the one which adds the new item to the host web list:

NB: This is a cut down extract of the function just to highlight the core request.

var executor;

// Initialize the RequestExecutor with the app web URL.
executor = new SP.RequestExecutor(appWebUrl);

var url = appWebUrl +
    "/_api/SP.AppContextSite(@target)/web/lists(guid'" + hostWebTaskList.Id + "')/items?@target='" + hostWebUrl + "'";

// Metadata for the new list item ('item' here is the source task object
// passed into the full function this extract was cut from).
var newItem = {
    "__metadata": { "type": hostWebTaskList.ListItemEntityTypeFullName },
    "Title": item.title,
    "Priority": item.priority,
    "Status": item.status,
    "Body": item.body,
    "PercentComplete": item.percentComplete
};

var requestBody = JSON.stringify(newItem);

var requestHeaders = {
    "accept": "application/json;odata=verbose",
    "X-RequestDigest": jQuery("#__REQUESTDIGEST").val(),
    "X-HTTP-Method": "POST",
    "content-length": requestBody.length,
    "content-type": "application/json;odata=verbose",
    "If-Match": "*"
};

executor.executeAsync({
    url: url,
    method: "POST",
    contentType: "application/json;odata=verbose",
    headers: requestHeaders,
    body: requestBody,
    success: addPrimeTasksToHostTaskListSuccessHandler,
    error: addPrimeTasksToHostTaskListErrorHandler
});

Ok, so at this point you’re probably wondering what the gotcha mentioned in the title is. Well here it comes, and it’s one of those cut-and-paste horror stories which cost developers all over the land huge amounts of wasted effort.

So if you take a look at the following block of code

var url = appWebUrl +
“/_api/SP.AppContextSite(@target)/web/lists(guid'” + hostWebTaskList.Id + “‘)/items?@target='” + hostWebUrl + “‘”;


//TODO: find out why this works but the below fails with not actually performing the update
url: url,
type: “POST”,
contentType: “application/json;odata=verbose”,
headers: requestHeaders,
data: requestBody,
success: addPrimeTasksToHostTaskListSuccessHandler,
error: addPrimeTasksToHostTaskListErrorHandler

You’ll notice I copied over the structure of the options object from a normal $.ajax call. THIS IS MY MISTAKE!!!!

As with many things the devil is in the details. By using this ajax snippet I’d introduced a bug which took nearly 4 hours to work out (very little found on the popular search engines about this). The worst part is that the call fires and comes back with a 200 success and even enters the success handler, BUT the action is not performed.

So what is the cause? Well basically there are subtle differences in signature.

  • The ajax call ‘type’ should be ‘method’ in the SP.RequestExecutor
  • The ajax call ‘data’ should be ‘body’ in the SP.RequestExecutor

So there you have it: two one-word typos which throw no errors but cause a logical failure in the code.
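One defensive option is a tiny translation helper, so a pasted-in $.ajax-style options object can’t silently no-op. This is my own sketch, not part of SP.RequestExecutor:

```javascript
// Maps $.ajax-style option names onto the names executeAsync expects:
// 'type' becomes 'method' and 'data' becomes 'body'; everything else is
// copied through unchanged.
function toExecutorInfo(ajaxOptions) {
    var info = {};
    for (var key in ajaxOptions) {
        if (key === "type") {
            info.method = ajaxOptions[key];
        } else if (key === "data") {
            info.body = ajaxOptions[key];
        } else {
            info[key] = ajaxOptions[key];
        }
    }
    return info;
}
```

Calling `executor.executeAsync(toExecutorInfo(options))` then works whether the options object was written for $.ajax or for executeAsync.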

I hope this helps someone else avoid the pain 😄

Some really useful information about this capability can be read at:

Chris’ app series covers using this library in anger –

Apps for Office and SharePoint blog article discussing the inner workings and options for cross domain versus cross site collection –

Using REST in SharePoint apps –

One final comment, MSDN Code has this sample: which doesn’t really demo cross domain at the time of writing as it isn’t using the code correctly against the host web in my opinion.

Awarded Microsoft MVP 2013 for SharePoint Server


Literally minutes before leaving work on 1st October I received the following email:


This left me utterly speechless, which is not something that normally happens to me. After re-reading it about ten times the news finally sank in. I’m ecstatic that Microsoft have awarded me the MVP award for my contributions to the SharePoint community.

So I wanted to share a number of thank-yous for everyone who has supported me along the way.

Most importantly my beautiful wife who has supported my many many hours chained to my PC in the office at home beavering away on CKSDev or Presentations instead of snuggling up in front of a movie on the sofa. Without her understanding and support I definitely wouldn’t have found time to make the contributions I have. Also sitting through many hours of practice presentations before my speaking events.

Matt Smith who, four years ago, helped me understand how to start becoming involved in the SharePoint community.

Waldek Mastykarz for his friendship, support and advice and those midnight coding adventures before releasing a new version of CKSDev. Thanks buddy 🙂

Chris O’Brien for his friendship, advice and putting up with some seriously deep technical conversations over the years when I’m trying to get my head around a SharePoint feature and Visual Studio API.

The CKSDev team past and present, Wouter, Matt, Waldek, Todd, Paul, Carsten and Wictor who all at some point over the last few years have advised, written some cool code and generally helped ensure CKSDev was meeting everyone’s needs.

The event organisers for SUGUK, Evolutions/Best Practices London, SPSUK and SPSNL (Steve Smith and Combined Knowledge, Brett, Tony and Nick, Mirjam and the DIWUG peeps) for providing me opportunities to present and be part of the event teams.

My employers Content and Code and especially people like David Bowman and Tim Wallis who gave me time and support to contribute to the community. Not to mention Salvo di Fazio, Tristan Watkins and Ben Athawes.

To conclude what has been quite an emotional post to write I just want to say thanks to everyone in the community and MVP community who have helped me all these years. There are too many to list, but you know who you are 🙂

CKSDev for Visual Studio 2012 version 1.2 Released

The 1.2 version has just been released. It contains loads more of the features you had available in VS2010. All of the remaining features will follow over the coming weeks as personal time allows. As you can imagine it’s no mean feat to port all of the existing features and support multiple SharePoint versions. The code base is also on a diet, as code bloat was getting a little crazy as the tools evolved over the past 4 years.
You can find CKS: Development Tools Edition on CodePlex at
Download the extension directly within VS2012 from the extension manager or visit the Visual Studio Gallery.
We’re still looking for extension ideas so please post in the CKS Dev Discussions anything you think would make life better for SharePoint developers.

CKS Dev Feature highlights

  • Copy Assembly Name – From the context menu of a SharePoint project you can now get the full assembly name copied to the clipboard.
  • Cancel adding features – Automatic cancellation of new SPIs being added to features. You can turn this off via the CKSDev settings options.
  • Find all project references – Find where a SPI/Feature is being used across the project from the project context menu.
  • Activate selected features – Setup which package features you want to auto-activate from the project context menu.
  • New features:
  • ASHX SPI template – Produces a full trust ASHX handler.
  • Basic Service Application template – Produces a full trust basic service application.
  • Branding SPI Template – Produces a full collection of SP2010 branding elements baseline items.
  • Contextual Web Part SPI template – Produces a contextual ribbon enabled web part.
  • WCF Service SPI template – Produces a full trust WCF Service endpoint.
  • Web template SPI template – Produces a SP2010 web template.
  • Improvements to Quick Deploy – A switch away from calling into GACUtil.exe back to direct GAC API calls to improve performance. Also removal of ‘custom tokenisation’ for now until a more performant version is tested.
  • Quick deploy GAC/Bin deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the assemblies to either the BIN or GAC depending on their packaging configuration, and runs the post-deployment command line.
  • Quick deploy Files deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the SharePoint artefacts to the SharePoint Root, and runs the post-deployment command line.
  • Quick deploy all deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, copies the assemblies to either the BIN or GAC depending on their packaging configuration, copies the SharePoint artefacts to the SharePoint Root, and runs the post-deployment command line.
  • Upgrade deployment configuration – A deployment configuration which runs the pre-deployment command line, recycles the application pool, upgrades the previous version of the solution, and runs the post-deployment command line.
  • Attach To IIS Worker Processes Step – Which attaches the debugger to the IIS Worker process during deployment.
  • Attach To OWS Timer Process Step – Which attaches the debugger to the OWS Timer process during deployment.
  • Attach To SPUC Worker Process Step – Which attaches the debugger to the User Code process during deployment.
  • Attach To VSSPHost4 Process Step – Which attaches the debugger to the Visual Studio deployment process during deployment.
  • Copy Binaries Step – Copies the assemblies during deployment to either Bin or GAC.
  • Copy To SharePoint Root Step – Copies the files during deployment to the SharePoint Root.
  • Install App Bin Content Step – Copies the files to the App Bin folder during deployment.
  • Install Features Step – Installs the features during deployment.
  • Recreate Site Step – Recreates the deployment targeted site during deployment.
  • Restart IIS Step – Restarts IIS during deployment.
  • Restart OWS Timer Service Step – Restarts The SharePoint Timer service during deployment.
  • Upgrade Solution Step – Upgrades the solution during deployment.
  • Warm Up Site Step – Calls the site to get SharePoint to warm it up during deployment.
To see more details and the full feature list visit

Visit the CodePlex site for more information about the features and release.

Share and Enjoy and thanks for the continued support

SPSNL 2013 Presentation


Saturday 29th June saw the 2013 SharePoint Saturday Holland. Another great event, organised with so many quality speakers and companies in attendance. It was a privilege to be invited to speak 🙂

I presented a session on Apps for Office 2013 and SharePoint 2013; the slides can be seen below. I hope everyone found the session useful 🙂 I certainly enjoyed presenting to such an interactive audience.

I think Apps for Office is one of the coolest new features of Office and SharePoint 2013. My session gives a really quick overview of the Apps for Office solution space, and then the hook-up between SharePoint and Office that is now possible, shown through the demo solution.

Over the next few months I’ll be publishing a dedicated series for Apps for Office so stay tuned for more soon.

Thanks to everyone who helped organise the event.

Custom SharePoint Item and Project Template gotchas for VS2012


This post is a quick brain dump of some of the challenges I’ve been working through updating CKSDev to work with Visual Studio 2012.

Item Templates


You can extend the SharePoint Visual Studio tooling in many different ways. One of which is by creating new Item Templates. Item templates are the code templates which appear in the ‘add new’ dialog in the solution project. Microsoft wrote several good walkthroughs as an example.

In Visual Studio 2010 the SharePoint items were grouped in a basic ‘SharePoint’ category and deploying them via a VSIX package was actually dead simple. Edit the CSProj and replace this VSTemplate element with the following XML, and then save and close the file.

<VSTemplate Include="ItemTemplate.vstemplate">
  <OutputSubPath>SharePoint\2010</OutputSubPath>
</VSTemplate>

The OutputSubPath element specifies additional folders in the path under which the item template is created when you build the project. The folders specified here ensure that the item template will be available only when customers open the Add New Item dialog box, expand the SharePoint node, and then choose the 2010 node.

So that worked fine in Visual Studio 2010 for SharePoint 2010. CKSDev for VS2010 had some smarts under the covers for packaging, but also followed this basic process.

So why is this an issue for Visual Studio 2012????

Well as you can see Microsoft moved the items into a new category called ‘Office/SharePoint’


In true Lt Gorman style (Aliens reference when the troops realise they can’t fire the pulse rifles under the reactor)

SO??, So what?

Rather than setting off a thermo-nuclear explosion, this just causes a real headache for custom template VSIX packaging.

Each VSTemplate item in your solution has a property called ‘Category’; this is where you can tell VS to package that template under a specific category. Great, so all you have to do is match the path in the new item dialog. In theory, yes.

The challenge is that the VS/MSBuild bits behind the scenes interpret a forward slash as a new folder. So you end up with Office with a sub-folder of SharePoint. Not what you wanted 😢

Ok so what next?

Well another thing to try is to match the OOTB folder path for the native MS templates. They live under “C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\ItemTemplates\CSharp\Office SharePoint\1033”

So setting the ‘Category’ to ‘Office SharePoint’ should work, right? Wrong, more 😢 ensues, as this time the space is encoded to %20 and the templates don’t show up at all.

This process of elimination had by this point taken well over two hours and was getting somewhat frustrating.

So as I mentioned earlier, CKSDev already has a subtle difference in its packaging. This is due to it being distributed via the Visual Studio Gallery. The VS Gallery has some nasty folder length checks (less than a combined 256 characters), which mean the CKSDev templates are compacted folder- and name-wise to be as short as possible. The VS Gallery rules don’t take account of the fact that the tools couldn’t install on XP; they check the length anyway and reject anything over 256.

Another example of this technique in action is here at Aaron Marten’s blog article which explains how this works.

The CKSDev packaging uses some MSBuild stuff to create an ‘I’ for item templates and ‘P’ for project templates during build. The VSIX then picks them up from there.

The assets section of the VSIX looks like this:

    <Asset Type="Microsoft.VisualStudio.MefComponent" Path="|CKS.Dev11;AssemblyProjectOutputGroup|" />
    <Asset Type="SharePoint.Commands.v4" Path="|CKS.Dev11;SP2010CommandProjectOutputGroup|" />
    <Asset Type="SharePoint.Commands.v5" Path="|CKS.Dev11;SP2013CommandProjectOutputGroup|" />
    <Asset Type="Microsoft.VisualStudio.VsPackage" Path="|CKS.Dev11;PkgdefProjectOutputGroup|" />
    <Asset Type="Microsoft.VisualStudio.Assembly" Path="|CKS.Dev11;AssemblyProjectOutputGroup|" />
    <Asset Type="Microsoft.VisualStudio.ProjectTemplate"
           d:VsixSubPath="P" />
    <Asset Type="Microsoft.VisualStudio.ItemTemplate"
           d:VsixSubPath="I" />

So the solution was to modify the MSBuild elements for the CSProj file

<Target Name="GetVsixTemplateItems" Returns="@(VSIXSourceItem)">
  <ItemGroup>
    <VSIXSourceItem Include="@(IntermediateZipItem)">
      <VSIXSubPath>I\$(ItemTemplateFolderPath)</VSIXSubPath>
    </VSIXSourceItem>
    <VSIXSourceItem Include="@(IntermediateZipProject)">
      <VSIXSubPath>P\%(OutputSubPath)</VSIXSubPath>
    </VSIXSourceItem>
  </ItemGroup>
</Target>


The item templates use a variable declared with the desired category (aka the folder path), whereas the projects use the ‘Category’ set in the properties window of the VSTemplate item (this is the output sub path, for those interested).

The path has to be declared in another element as it contains a space.

    <PropertyGroup>
      <ItemTemplateFolderPath>Office SharePoint\CKSDev</ItemTemplateFolderPath>
    </PropertyGroup>

All that together gives the desired effect of all the CKSDev items appearing inside the right OOTB category.



Ok, so now everything is sitting in the desired place. As with a lot of CKSDev coding, it’s the little things like this which take so long. Understanding Visual Studio, MSBuild and SP APIs: all in a day’s work 😛

Project templates


So with the information above the project template settings are actually a lot easier.

Simply set your category to ‘Office\SharePoint Solutions’, as you actually want it to appear in a sub-folder this time.


I hope this saves someone time and headaches 😄