Friday, 14 December 2018

It's a wrap: Office Development Bootcamp 2018

For the second year in a row I had the pleasure of organising the Office Development Bootcamp along with my good fellow MVPs Ashish Trivedi & John Liu. This year was a bit special for me: we not only delivered the usual three bootcamps we had last year (Sydney, Melbourne and Auckland), but also added three additional cities to the mix:
  • Brisbane: organised by my fellow MVP @ChrisGecks
  • Hong Kong: my first attempt at remotely organising an event, and it drew an awesome turnout of 66 people; great job by Microsoft HK
  • Kuala Lumpur: my second attempt at organising something remotely, and it came very close to matching the Hong Kong event
I personally can't wait for next year's event, where we could potentially add at least three more cities.

Sunday, 11 November 2018

Yo Teams: Running local https server

I've been using the Teams Yeoman generator for quite a long time, and I've even made one contribution to this awesome open-source project.
However, I've always wondered why it runs on local http while the manifest requires the tab endpoint to be an https endpoint; so if you are building a Microsoft Teams tab, you won't be able to run it locally without enabling https on your local server.
The method I used to get around this was ngrok, using its https endpoint, but deep inside I didn't want to expose my local tab code externally; maybe someone out there is trying all 16^8 (roughly 4.3 billion) possible sub-domains (nah, just kidding). I think I was just determined to run tabs locally on an https server.

So I've updated my fork with the latest changes since my last contribution (almost a year ago), then did the following steps to make the generator create an https local server:
  1. I created a new local branch and called it https (very creative name!)
  2. I noticed I needed to generate a certificate and private key using the openssl command (you can either install an openssl.exe win 32/64 binary or run openssl through Ubuntu on Windows 10, if you have a Windows machine)
  3. I placed the two files in a cert folder under the main app template, which is common across all Yo Teams artefacts
  4. I used the webpack plugin "copy-webpack-plugin" to copy the cert folder
  5. To make sure that webpack emits my two files, I added the plugin entry to the webpack server config under plugins; don't forget to install the copy-webpack-plugin package

  6. Now to the fun and easy part, which is changing the server.ts class: first change import * as http from "http" to "https", then replace the http server creation with its https equivalent
  7. To run this version of the generator, run npm link
  8. Now link your project folder with the new generator, run it, and when you type gulp serve an https server will be created, so you can run your tab on the local server
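As a rough sketch of what steps 5 and 6 might look like: on the webpack side, a plugin entry such as `new CopyWebpackPlugin([{ from: 'cert' }])` copies the certificate folder into the output; on the server side, something along these lines (the cert/key file names and the fallback behaviour are my assumptions, not the generator's actual code):

```typescript
// Hedged sketch of the server.ts change: serve over HTTPS when the
// openssl-generated files exist in ./cert, otherwise fall back to HTTP.
import * as fs from "fs";
import * as http from "http";
import * as https from "https";

export function createLocalServer(
  requestListener: http.RequestListener,
  certDir: string = "./cert"
): http.Server | https.Server {
  const certPath = `${certDir}/cert.pem`;
  const keyPath = `${certDir}/key.pem`;
  if (fs.existsSync(certPath) && fs.existsSync(keyPath)) {
    // HTTPS: what the Teams manifest requires for a locally hosted tab
    return https.createServer(
      { cert: fs.readFileSync(certPath), key: fs.readFileSync(keyPath) },
      requestListener
    );
  }
  // Fall back to plain HTTP if no certificate is present
  return http.createServer(requestListener);
}
```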

Sunday, 14 October 2018

Yo Teams: Azure App services Deployment error

It has been a while since I played around with generator-teams (the Yeoman generator for Microsoft Teams); it was almost six months ago that I last demoed the capabilities of this amazing open-source project. This time I created a quick project which includes a simple tab; my intention was to run it locally and also publish it to Azure App Service.

Last time, it was a very straightforward process to publish the Teams tab to an Azure App Service using a local Git repository and push my master branch to it. However, this time it wasn't the easy ride I expected.

I won't go through the obvious steps, which are setting up your environment for nodejs development and installing the latest Yo Teams package (2.5 at the time of writing this post).
Long story short, I created a new Teams app with only a simple tab, then created a new Azure App Service and added local Git as a deployment option so I could push my code to it and achieve a very simple deployment.

After the awesome generator created my artefacts, I ran npm install and gulp build locally and it was rocking; everything worked fine. I initialised my git repo, added the Azure git as a remote (I called it azure, very creative!), then pushed my master branch to Azure. I was waiting for the magic successful-deployment message, but instead I got the following error:

Apparently, by default the node and npm versions used by App Service are v0.10.40 and v1.4.28, which are relatively old and caused some errors in npm.

Using App settings you can set the nodejs version for the App Service instance, but I couldn't find a switch for npm, which was what was actually causing the error above, so I decided to specify the node & npm versions another way, by adding them to the package.json file as below:
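The engines addition might look like the following (the exact version numbers here are illustrative; pick the versions your project was built against):

```json
{
  "engines": {
    "node": "8.11.1",
    "npm": "5.6.0"
  }
}
```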

I made a minor update and pushed the new version to the Azure App Service local git repo, and yet I stumbled upon another error:

I created a new folder called dist under wwwroot so the script would be able to create the iisnode.yml file. I thought that was it; however, I found another error, this time in the gulpfile.js syntax.

The reason this time was that the node version used to run the scripts was still the same old default, so I had to add a new line to deploy.cmd just before the build command line.
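As a sketch, the added line might look something like the following (the path is hypothetical and depends on the node versions installed on your App Service instance):

```bat
:: Hypothetical addition just before the build step in deploy.cmd:
:: put the desired node version ahead of the old default on the PATH.
SET PATH=D:\Program Files (x86)\nodejs\8.11.1;%PATH%
```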
And finally, my deployment succeeded.

And what was supposed to be a one-minute job turned out to be a half-hour job.

Tuesday, 21 August 2018

SPFx: How OTB News Webpart displays ViewCount

In this blog post I'll explain how the out-of-the-box News webpart displays the view count for each promoted site page, aka "News page". At first glance, when a colleague asked me, I naively answered
"Oh, check the ViewCountLifeTime managed property", thinking (silly me) that the modern pages webpart would somehow use the same technique that old publishing pages used to embrace, which had the viewCountLifeTime and viewsLastNDays managed properties and gave us a lot of options to choose precisely what we want to display.

Back then we weren't worried about where SharePoint stores these values, as it would all be enriched via the search pipeline, which was easy, awesome, and just worked.

When my colleague tried out search with some of the classical ViewCount managed properties, they got nothing, which left me scratching my head and honestly questioning my sanity. I had been a bit away from a hands-on role, and by a bit I mean it had been more than four years since I'd rolled up my sleeves and coded stuff, as I'm currently doing mainly high-level activities. With no further ado, I decided to look under the hood of the OTB News webpart.

The first thing I noticed, after viewing the bundled code using the pretty-format feature in Google Chrome in the file sp-news-webpart.bundle_en-us_someguid.js, is that when the component is about to mount, a function called "updateRealNewsItems" is called; this method takes a set of items (the news items with the view count) as a parameter.

With a bit more digging, searching for "viewCounts", I managed to find an expression that verifies whether the view count is needed or not, which depends on the display template that you choose. If the view count is needed, the code first tries to update the view count data-provider information, then calls refresh data.

The code continues by preparing a request, getting the view count, and caching it, so next time it will be served from the cache; then it adds the view count to the news items returned by search.
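That request-then-cache pattern can be sketched roughly like this; every identifier below is hypothetical, a reconstruction of the behaviour observed in the bundle, not the webpart's real API:

```typescript
// Hedged reconstruction: fetch a page's view count once, then serve it
// from an in-memory cache on subsequent renders.
const viewCountCache = new Map<string, number>();

function getViewCount(pageId: string, fetchCount: (id: string) => number): number {
  const cached = viewCountCache.get(pageId);
  if (cached !== undefined) {
    return cached; // subsequent lookups come from the cache
  }
  const count = fetchCount(pageId); // stand-in for the view-count request
  viewCountCache.set(pageId, count);
  return count;
}
```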

That kept me wondering: is it that hard to enable the search pipeline to update ViewsCount to accommodate Site Page content, so we could rely on the viewCount managed property? I really have no answer to that; maybe it is. Additionally, I believe that if the modern webparts' source were available on GitHub, it would make SharePoint developers' lives easier.

Just my $0.02

Tuesday, 31 July 2018

SPFx: A Facebook Feed webpart with custom UI

So, this post might look trivial or pretty straightforward. However, it's not about how complex it could be from an SPFx point of view; I hate to break it to you guys, it's really simple.

If you would like to aggregate social posts from various platforms in one single view using your own UI elements and design, this post could be useful. Otherwise, it will be a complete waste of time :). So yes, you'd better embed your Facebook feed via an iframe if you don't want to customise the UI or aggregate multiple social posts in one view.

Before we start, you need to have the following artefacts:
  • A Facebook app (go to the Facebook developers portal and follow the simple steps to have your app created)
  • A Facebook page for testing purposes, or use an existing page that you have access to.

To play around with the Facebook Graph API and take a look at how the feed JSON object is structured, you can try it out in the Graph Explorer tool. The endpoint we are after is very simple: a GET request to v3.1/{yourPageID}/feed.

When you start playing around with the tool, you will find that there are two main parameters:
  1. limit (which limits the number of posts); it will be added by default for you with a value of 10
  2. fields (which dictates which fields you want to retrieve); if you don't supply the fields, you will get the default result set, which is as below:
"data": [
      "created_time": "2018-07-23T04:49:18+0000",
      "message": "Sydney Global Office 365 Developer Bootcamp 2018",

  3. You will also need the access token, which is a bearer token passed as part of the request header. For the purposes of this application you might choose one of two options:
    1. Use an application token --> this will require a review of your app; see below what you get if you try your app token:
  "error": {
    "message": "(#10) To use 'Page Public Content Access', your use of this endpoint must be reviewed and approved by Facebook. To submit this 'Page Public Content Access' feature for review please read our documentation on reviewable features:",
    "type": "OAuthException",
    "code": 10,
    "fbtrace_id": "

    2. Use a page access token --> this will need you, as an admin of the page, to grant the app access to the page posts.
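Composing the feed request described above could look like the sketch below; the page ID and field list are placeholders, and the bearer token (whichever option you pick) goes in the request header, not the URL:

```typescript
// Minimal sketch: build the Graph API feed URL with the two main
// parameters, limit and fields. Values here are illustrative only.
function buildFeedUrl(pageId: string, limit: number, fields: string[]): string {
  const params = new URLSearchParams({
    limit: String(limit),
    fields: fields.join(","),
  });
  return `https://graph.facebook.com/v3.1/${pageId}/feed?${params}`;
}
```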

The drawback of this option is that you will have a very short-lived access token, and Facebook no longer provides tokens that never expire (offline-access tokens were dropped back in 2012; yup, I'm referencing something that has been deprecated for more than six years).

But there is still hope: you can extend your access token using the documented exchange method, which allows you to convert your short-lived token (about 2 hours) into a long-lived token that expires in 60 days.
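The exchange is a GET to the oauth/access_token endpoint with the fb_exchange_token grant; a sketch of composing it (app ID, secret and token values are placeholders):

```typescript
// Build the Graph API URL that swaps a short-lived token for a ~60-day one.
function buildExchangeUrl(appId: string, appSecret: string, shortLivedToken: string): string {
  const params = new URLSearchParams({
    grant_type: "fb_exchange_token",
    client_id: appId,
    client_secret: appSecret,
    fb_exchange_token: shortLivedToken,
  });
  return `https://graph.facebook.com/v3.1/oauth/access_token?${params}`;
}
```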

From a SharePoint perspective, we are going to create a simple webpart that has three properties:
  • Limit
  • Page Id
  • Access Token
The webpart is very simple and straightforward; I didn't try to include any fancy styles, and it's a single React component called FbPostList, which consists of a list of FbPost components.

The main issue here is: do I want the site owner to update the access token every 60 days? That sounds very irritating and, to be honest, utterly stupid :) so I've built an Azure function that runs every 30 days and is responsible for updating the access token, which I've stored in a storage account; the SharePoint webpart is now responsible for reading the access token stored in the storage table. Also make sure to store your Facebook app secret somewhere safe (Azure Key Vault).

Happy SharePointing!

Sunday, 27 May 2018

How MS Flow displays SharePoint Online Taxonomy field values

If you have an Office 365 tenant and haven't played around with MS Flow, I believe you're missing out on something big here. You can use Microsoft Flow to do basically everything; yes, I said it, "everything". I might be exaggerating, but I know a friend who does almost everything using MS Flow. If you are in doubt, just follow @johnnliu on Twitter and see for yourself.

And it's free; yes, only for 2000 runs per month, but for a productivity tool, who wants more than that :) If you are looking to do some integration work, I suggest you go for Azure Logic Apps, which is the underlying platform of Microsoft Flow. And if you want to abuse Microsoft Flow, you can add more runs for as cheap as 8 cents per additional 100 runs.

Let's dive into the topic of this blog. To help you review everything in action, we will start by creating a new flow; for simplicity we will use the HTTP trigger, so we can trigger the flow using a simple GET request.

The action we will be performing here is updating SharePoint file properties. To make this work, you will have to connect to an existing SharePoint Online library that has been enriched with a taxonomy field as an attribute of a custom content type derived from the parent "Document" content type.

The main question I had here is how Flow retrieves the classification values, because I wanted to select the value based on a specific parameter and piggyback on what MS Flow already knows.
If you trace the XHR requests in the browser developer console, you will be able to find multiple requests to a specific endpoint similar to the following URL:

This request allows MS Flow to load all the selected document library's properties, including all the lookup and taxonomy fields and many other properties.

One particular property is the URL to the taxonomy field values, which is the same URL with /entities/{some_guid} added to it as a suffix.

If we trace the request, we will be able to extract the Authorization Bearer header, which contains typical JWT token properties, including:
Issuing Authority
AppId: 6204c1d1-4712-4c46-a7d9-3ed63d992682
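You can inspect those claims yourself with a minimal sketch like this (generic JWT decoding, assuming a standard three-segment token; this is not Flow-specific code):

```typescript
// Split the JWT, base64url-decode the middle segment, parse the JSON claims.
function decodeJwtPayload(jwt: string): Record<string, unknown> {
  const payload = jwt.split(".")[1];
  // Convert base64url to standard base64 before decoding
  const normalized = payload.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(normalized, "base64").toString("utf8"));
}
```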

If we navigate to the Azure portal and search for this app ID under Enterprise applications in Azure AD, we will find the Microsoft Flow Portal app, which strangely enough has no permissions visible through the Azure portal.

And that's how Microsoft Flow displays your SharePoint taxonomy field values within the SharePoint update file properties action.

Sunday, 22 April 2018

Thoughts on PowerBI anonymous sharing link

So today we will explore, in a very brief post, how Power BI generates the anonymous sharing link. It all started when I created an anonymous sharing link, which looked like the following: {some key}

The part I was curious about is the generated key, which looks like a base64-encoded string; I decoded the value and got this JSON string representation:

Just from this JSON string representation, I can somehow understand the two GUID values: k refers to "key", which is the sharing key of the report and which I discovered to be unique per report. The most obvious value was t, which refers to your tenant; to be even more precise, it's the Azure Active Directory "Directory ID" value. You can always double-check by logging into your Azure portal and checking your Azure Active Directory ID on the properties page.
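The decode/encode round trip I used for the experiments below can be sketched like this; the k/t/c field names mirror what the decoded string appeared to contain, and the sample values are hypothetical:

```typescript
// The share key is just base64-encoded JSON, so decoding and re-encoding
// it for experimentation is a two-liner either way.
interface ShareKey { k: string; t: string; c?: string; }

function decodeShareKey(encoded: string): ShareKey {
  return JSON.parse(Buffer.from(encoded, "base64").toString("utf8")) as ShareKey;
}

function encodeShareKey(key: ShareKey): string {
  return Buffer.from(JSON.stringify(key)).toString("base64");
}
```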

the "c" value doesn't really resonate with anything I thought it's somehow refers to the report Id. although one might argue that the key GUID is enough as it will uniquely identify the report. However, I tried to change it encode the object and open it in the browser. Interestingly enough I had my report opened. Firstly, I thought the browser might cache a certain value so It tried a completely different browser and it worked as well!

What I did next was even more interesting: I removed the whole c property, and guess what, it also worked, so I have no clue what this c value refers to!

One interesting thought is whether the sharing key alone is enough from a security standpoint, given that a GUID is globally unique because it uses the device ID and a timestamp in addition to some other inputs (algorithm, uniquifier); for more information regarding the GUID structure, you can refer to this blog post.

That brings us to the end of this brief post.