Thursday, 9 May 2019

SharePoint Online: Measuring SharePoint Modern Experience Performance


In this post I'll share my experience of how to measure the performance of the SharePoint Online modern experience and, most importantly, what you should expect and communicate to your client, especially a client that is moving from a fully branded, customised on-premises intranet to the SharePoint Online modern experience.

First, you need to identify which metrics you will use to assess the intranet's performance. For this post I'll try not to get sidetracked into accessibility and the other non-functional aspects of your intranet platform.

There are many metrics that could be used, but as we are putting a SaaS platform to the test I will ignore any server-related performance metrics, since we can't optimise server performance by any means.
Of course, you can always check the value of the X-SharePointHealthScore custom header and contact Microsoft support if you are not happy with your tenant's performance, but for this post I'll focus on client-side metrics, which are mostly affected by your client machine and browser.


There are many tools you can use to measure the performance of a website. However, if you are using the Chrome browser (please don't tell me you are using Edge or, worse, Internet Explorer; even Microsoft has given up on them), launching the developer tools gives you easy access to an audit tool that can perform a full audit of the current website. The tool is called Lighthouse, which is why I decided to stick with the metrics Google uses to score website performance:

  • Time to First Contentful Paint
  • Time to First Meaningful Paint
  • Time to Interactive
  • Time to CPU Idle
  • Speed Index
  • Estimated Input Latency


That's good: we can easily execute the audit, but how can we automate this? The answer is very simple. The good lads at Google have built us a CLI for Lighthouse that can be installed using npm; please check the GoogleChrome lighthouse repo for more information on how to install and run it.

In short, you will be able to run a command line like the one below, which executes only a performance audit.
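Roughly, the command looks like this; the URL and output path are placeholders, and the flags are the ones explained in the next few paragraphs:

    lighthouse https://contoso.sharepoint.com/sites/intranet --only-categories=performance --disable-device-emulation --throttling-method=provided --disable-storage-reset --output=json --output-path=./output/home-run1.json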

As you can see, the script is very simple. I start by getting the URLs of the SharePoint Online pages I want to test from a CSV file, which also holds the required number of runs per page.

Then I use the Lighthouse command flags to ensure that there is no throttling or emulation (--disable-device-emulation --throttling-method=provided). I'm also exporting each audit run's output as a JSON file into a specific output folder.

I also pass the --disable-storage-reset switch to ensure that the browser cache is preserved between runs. Another flag worth mentioning is --only-categories=performance, which executes only the performance-related audits.
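My driver script is PowerShell, but if you prefer Node the same loop can be sketched in TypeScript. This is a rough equivalent rather than the original script, and the CSV file name and its url,runs columns are assumptions:

    // runAudits.ts - rough Node/TypeScript equivalent of the PowerShell driver.
    // Assumes a CSV with header "url,runs" and the Lighthouse CLI on the PATH.
    import { execSync } from 'child_process';
    import * as fs from 'fs';

    const rows: string[] = fs.readFileSync('./pages.csv', 'utf8').trim().split('\n').slice(1);

    for (const row of rows) {
      const [url, runs] = row.split(',');
      for (let i = 1; i <= Number(runs); i++) {
        const outPath: string = `./output/${encodeURIComponent(url)}-${i}.json`;
        // Same flags as described above: no emulation, no throttling, keep the cache,
        // performance audits only, one JSON file per run. Add --port=<debug port> later
        // if you reuse an already logged-in Chrome session, as described further down.
        execSync(
          `lighthouse ${url} --only-categories=performance --disable-device-emulation ` +
          `--throttling-method=provided --disable-storage-reset --output=json --output-path=${outPath}`,
          { stdio: 'inherit' });
      }
    }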


When I started running the report I got amazing results; however, when I looked at the trace using the Lighthouse report viewer I found out that I was being redirected to the login page, which doesn't have much on it, hence the amazing 100/100 score.


I looked at ways to pass my login credentials, but I found an easier way, which is running Chrome in debug mode.
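The command is a plain Chrome launch with remote debugging enabled; the port number is just an example, and chrome.exe needs to be on your PATH or run from its install folder:

    chrome.exe --remote-debugging-port=9222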


This will launch a new Chrome window. I'll navigate to the SharePoint Online URL and log in using my user credentials, so this browser session will already have my user logged in. Afterwards, I need to pass the resulting port number to the Lighthouse CLI (via the --port flag).

It's a kind of lazy solution, but it worked OK for me; it is definitely better to run Chrome headless.

The results

After running the PowerShell script I have a number of JSON files, and I want to get the average value for each of the above metrics. Do I need to write another PowerShell script? Hmm, I don't think so; I'm a very lazy person, and I have a MongoDB instance installed on my laptop. I imported the files into a collection, then ran the following script to get the average results.
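The averaging can also be done from Node/TypeScript with the official mongodb driver, as an alternative to a mongo shell script. The database and collection names below are assumptions, and the audit field names vary by Lighthouse version (numericValue in recent versions, rawValue in older ones):

    // averageResults.ts - compute average Lighthouse metrics per URL with the mongodb driver.
    import { MongoClient } from 'mongodb';

    async function main(): Promise<void> {
      const client: MongoClient = await MongoClient.connect('mongodb://localhost:27017');
      const runs = client.db('lighthouse').collection('runs');

      const averages = await runs.aggregate([
        { $group: {
            _id: '$requestedUrl',
            avgScore: { $avg: '$categories.performance.score' },
            // Audit field names are assumptions; adjust to rawValue on older Lighthouse versions.
            avgFcp: { $avg: '$audits.first-contentful-paint.numericValue' },
            avgTti: { $avg: '$audits.interactive.numericValue' },
            avgSpeedIndex: { $avg: '$audits.speed-index.numericValue' },
            runs: { $sum: 1 }
        } }
      ]).toArray();

      console.table(averages);
      await client.close();
    }

    main().catch(console.error);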


To be honest, the results for an empty OOTB team site were not bad: Google gave the OOTB team site a performance score of 86/100, which is a decent score, but as you might see, most of the cost comes down to JavaScript execution time and main-thread work.


If your client is after an intranet solution with a Speed Index of less than 3 seconds, you might need to consider building your intranet as a loosely-coupled intranet, a concept we discussed here more than 3 years ago; you can find more details in the following post:
http://www.sharepointtweaks.com/2016/01/officedev-the-new-intranet-loosely-coupled-approach.html

If you are happy with the current performance and want to customise SharePoint Online and start building SPFx extensions and web parts, you need to be very careful and very cautious about what you use, as every bit of JavaScript will matter, and at some point you will have to tell your client: yes, I can do this, but it will slow your site down.

In the next post I'll list some techniques that helped me lift a custom SharePoint Online intranet from a score of 50/100 to being comparable to the OOTB experience, with a score of 86/100.

Till next time

Sunday, 17 March 2019

SPFx: Modal Dialog, show classic SharePoint forms


Remember the Modal Dialog? That surely brings back some memories. It was easy back in the day: a simple call to instantiate a new SP.UI.ModalDialog with the appropriate options, then show the dialog, and that's it!
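As a refresher, a classic-script call looked roughly like this; the URL and option values are illustrative only:

    // Classic SharePoint page (SP.js loaded); URL and options are examples.
    var options = {
      url: '/sites/intranet/Lists/Requests/DispForm.aspx?ID=1&isDlg=true',
      title: 'Request details',
      width: 600,
      height: 400,
      dialogReturnValueCallback: function (result, returnValue) {
        console.log('Dialog closed with result ' + result);
      }
    };
    SP.UI.ModalDialog.showModalDialog(options);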

Of course, we had to make sure that we appended isDlg=true to the URL query string when showing a SharePoint form.

We also used to make sure that the header and footer adhered to the branding guidelines, so when we customised the heck out of the SharePoint master page (prior to SharePoint 2013) we didn't get the funny header and footer ruining our dialog box.

The other bit that happened for us: when we pressed the OK or Cancel buttons on a SharePoint form (whether it's item display or edit), the modal dialog disappeared magically.

So let's take SPFx: how can we replicate the same functionality with the simplest possible approach?

BaseDialog & DialogContent to the rescue

BaseDialog is an abstract class wrapped and delivered to us as part of the @microsoft/sp-dialog package. By simply extending this dialog class and implementing the render method, you can construct the content of your dialog (there are heaps of other methods you can override to customise the behaviour of the dialog, but this post is about the simplest iframe dialog ever).

In our case, let's implement a new React component called IframeContent, which acts as a container for our iframe. This simple React component contains a single root DialogContent component, which is imported from the office-ui-fabric-react package. The IframeContent component has a single child element, which is the iframe HTML tag.
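A minimal sketch of that component could look like the following; the props interface, sizing and styling are my own assumptions rather than the post's original code:

    import * as React from 'react';
    import { DialogContent } from 'office-ui-fabric-react';

    // Hypothetical props: the classic form URL to load plus a close callback supplied by the dialog.
    export interface IIframeContentProps {
      url: string;
      title: string;
      onDismiss: () => void;
    }

    export class IframeContent extends React.Component<IIframeContentProps, {}> {
      public render(): React.ReactElement<IIframeContentProps> {
        // Append isDlg=true so the classic form hides its own chrome inside the dialog.
        const separator: string = this.props.url.indexOf('?') === -1 ? '?' : '&';
        return (
          <DialogContent title={this.props.title} showCloseButton={true} onDismiss={this.props.onDismiss}>
            <iframe src={`${this.props.url}${separator}isDlg=true`} style={{ width: 600, height: 400, border: 'none' }} />
          </DialogContent>
        );
      }
    }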


The second component is even simpler: the dialog itself, which extends BaseDialog. The most significant bit of code is the render method, which basically does nothing apart from rendering a single instance of the previously created dialog content.
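A sketch of that wrapper, again with assumed names:

    import { BaseDialog, IDialogConfiguration } from '@microsoft/sp-dialog';
    import * as React from 'react';
    import * as ReactDOM from 'react-dom';
    import { IframeContent } from './IframeContent';

    export class IframeDialog extends BaseDialog {
      constructor(private url: string, private dialogTitle: string) {
        super();
      }

      public render(): void {
        // Render the React container into the DOM element the framework provides.
        ReactDOM.render(
          React.createElement(IframeContent, {
            url: this.url,
            title: this.dialogTitle,
            onDismiss: () => { this.close(); }
          }),
          this.domElement);
      }

      public getConfig(): IDialogConfiguration {
        return { isBlocking: true };
      }

      protected onAfterClose(): void {
        super.onAfterClose();
        ReactDOM.unmountComponentAtNode(this.domElement);
      }
    }

Showing a classic display form then boils down to something like new IframeDialog('/sites/intranet/Lists/Requests/DispForm.aspx?ID=1', 'Request details').show().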


So far so good, but what if we need to replicate the magic we used to have (hiding the dialog upon clicking OK or Cancel on an OTB classic SharePoint form)?

The answer is simply to use the same old mechanism: the old forms send an event to the parent window called "CloseDialog", so all we need to do is let our React component listen for that event and call the close method.
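Inside the IframeContent component, that could look something like the members below; the exact shape of the message payload may differ between form types, so treat the string check as an assumption and inspect what your forms actually post:

    // Members to add to the IframeContent component sketched earlier.
    public componentDidMount(): void {
      window.addEventListener('message', this.onMessage);
    }

    public componentWillUnmount(): void {
      window.removeEventListener('message', this.onMessage);
    }

    private onMessage = (event: MessageEvent): void => {
      // The classic form signals the parent window when OK/Cancel is pressed.
      const data: string = typeof event.data === 'string' ? event.data : JSON.stringify(event.data);
      if (data && data.indexOf('CloseDialog') !== -1) {
        this.props.onDismiss();
      }
    }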

The full code of the IframeDialog component is below.
A more comprehensive implementation, with different building blocks, can be found in the following GitHub repo: https://github.com/SharePoint/sp-dev-fx-controls-react




Monday, 11 February 2019

SharePoint Online: What really happens when you click unfollow/follow site buttons



So, I'm back for the first post this year after quite a break. I can't believe it's 2019 already and the Dubai 2020 Expo is only one year away. I don't live in Dubai anymore, but I remember thinking of 2020 as the far future.
Without further ado, let's dive into this blog post's topic:

What really happens when you unstar or star a SharePoint Online site on the SharePoint home page? I presumed, naive me, that a call to the follow API endpoints is triggered, but my naivety has been proven wrong many times before, especially when I thought that the modern News web part uses Search analytics to display the view count (it turns out to get it from an endpoint at https://{your-region}.sphomep.svc.ms); you can read more about this here.


Similarly, the follow and unfollow site actions use a similar endpoint.

Firstly, let's see what happens when we unstar an already-followed site. A POST request is fired, as below.


This request has the usual header information in addition to a Bearer token, which looks like the below after decoding the Base64 and removing the signing bits at the end.

The function used to update the followed-site status is called sendSiteFollowingUpdateRequest, and it takes three arguments; the first one is an object that contains whether the site is followed or not, along with the site card item information.


Next, let's try to understand how the aforementioned bearer token was obtained. Looking at the session storage, I can see that the same token is saved under the key "ms-oil-datasource-SpHomeApiDataSource", as below:


Going through the code, I can see that it was obtained by a simple POST request to the endpoint _api/SP.OAuth.Token/Acquire with the proper digest value.
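If you wanted to replay that call from an SPFx web part, a rough sketch using SPHttpClient (which takes care of the request digest on POST calls) might look like this; the response property name is an assumption, and keep in mind the caveat below about these not being public APIs:

    import { SPHttpClient, SPHttpClientResponse } from '@microsoft/sp-http';

    // Rough sketch only: _api/SP.OAuth.Token/Acquire is the endpoint observed above,
    // but its payload is undocumented, so inspect the real response before relying on it.
    export async function acquireSpHomeToken(spHttpClient: SPHttpClient, webUrl: string): Promise<string> {
      const response: SPHttpClientResponse = await spHttpClient.post(
        `${webUrl}/_api/SP.OAuth.Token/Acquire`,
        SPHttpClient.configurations.v1,
        {});
      const body: any = await response.json();
      return body.access_token; // assumed property name
    }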




Maybe one day we will have full documentation for the sphome.svc.ms web services and what kind of first-party functionality is being exposed there.

These findings are only accurate at the time of writing this blog, as these are not publicly available, versioned APIs; use them at your own discretion, and preferably not outside of a proof of concept.


Ciao

Friday, 14 December 2018

It's a wrap: Office Development Bootcamp 2018


For the second year in a row I had the pleasure of organising the Office Development Bootcamp along with my good fellow MVPs Ashish Trivedi & John Liu. This year was a bit special for me, as we didn't only deliver the usual three bootcamps we had last year (Sydney, Melbourne and Auckland) but added three additional cities to the mix:
  • Brisbane: Organised by my fellow MVP @ChrisGecks
  • Hong Kong: my first attempt to remotely organise an event, and it was an awesome turnout of 66 people; great job by Microsoft HK
  • Kuala Lumpur: my second attempt to organise something remotely, and it came very close to matching the Hong Kong event
I personally can't wait for next year's event and the chance to add at least three more cities.

Sunday, 11 November 2018

Yo Teams: Running local https server


I've been using the Teams Yeoman generator for quite a long time, and I've even made one contribution to this awesome open-source project.
However, I've always wondered why it runs on local HTTP while the manifest requires the tab endpoints to be HTTPS, so if you are building a Microsoft Teams tab, you won't be able to run it locally without enabling HTTPS on your local server.
The method I used to get around this was using ngrok and its HTTPS endpoint, but deep inside I didn't want to expose my local tab code externally; maybe someone out there is trying out all ~4.3 billion (16^8) possible sub-domains (nah, just kidding). I think I was just determined to run tabs locally on an HTTPS server.

So I've updated my fork with the latest changes since my last contribution (almost a year ago), then did the following steps to make the generator create an HTTPS local server:
  1. I've created a new local branch and called it https (very creative name!)
  2. I've noticed I need to generate a certificate and private key using the openssl command (you can either install an openssl.exe Win32/64 binary or run openssl through Ubuntu on Windows 10 if you have a Windows machine); a sample command is shown after this list
  3. I've placed the two files in a cert folder under the main app template, which is common across all Yo Teams artefacts
  4. I've used a webpack plugin, copy-webpack-plugin, to allow me to copy the cert folder
  5. To make sure that webpack will spit out my two files, I've added the following line to the webpack server entry under plugins (see the copy-webpack-plugin snippet after this list); don't forget to install the copy-webpack-plugin package

  6. Now to the fun and easy part, which is changing the server.ts class: first change the import * as http from 'http' import to use 'https', and then replace the server creation with the lines in the server.ts snippet after this list
  7. To run this version of the generator, run npm link
  8. Now link your folder with the new generator, run it, and when you type gulp serve an HTTPS server will be created and you can use the local server to run your tab
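The snippets referenced above are sketched below; the file paths and names are assumptions based on my setup rather than something the generator dictates.

A self-signed certificate and key (step 2) can be generated with openssl, for example:

    openssl req -x509 -newkey rsa:2048 -nodes -days 365 -keyout key.pem -out cert.pem -subj "/CN=localhost"

For step 5, the copy-webpack-plugin entry in the server webpack config looked roughly like this (the older array-based syntax, matching the plugin versions of the time; the cert source path is an assumption):

    new CopyWebpackPlugin([
      { from: 'src/app/cert', to: 'cert' }
    ])

And for step 6, the server creation in server.ts becomes something along these lines, assuming the Express instance is called express, the port variable already exists in the original file, and the copied cert folder sits next to the compiled server:

    import * as https from 'https';
    import * as fs from 'fs';
    import * as path from 'path';

    // Create an HTTPS server from the generated certificate and key
    const server = https.createServer({
      key: fs.readFileSync(path.join(__dirname, 'cert', 'key.pem')),
      cert: fs.readFileSync(path.join(__dirname, 'cert', 'cert.pem'))
    }, express);

    server.listen(port, () => {
      console.log(`Server running on https://localhost:${port}`);
    });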


Sunday, 14 October 2018

Yo Teams: Azure App services Deployment error



It has been a while since I played around with generator-teams (the Yeoman generator for Microsoft Teams); it was almost 6 months ago that I demoed the capabilities of this amazing open source project. This time I've created a quick project which includes a simple tab; my intention was to run it locally and also publish it to Azure App Services.

It was a very straightforward process publishing the Teams tab to an Azure App Service using a local Git repository and pushing my master branch to it. However, this time it wasn't the easy ride I expected.

I won't go through the obvious steps of setting up your environment for Node.js development and installing the latest Yo Teams package (2.5 at the time of writing this post).
Long story short, I created a new Teams app with only a simple tab, created a new Azure App Service and added local Git as a deployment option, so I could push my code to it and achieve a very simple deployment to Azure App Services.

After the awesome generator created my artefacts, I ran a local npm install and gulp build and it was rocking; everything was working fine locally. I initialised my Git repo, added the Azure Git as a remote (I called it azure, very creative!), then pushed my master branch to Azure. I was waiting for the magic successful-deployment message, but instead I got the following error


Apparently, by default the Node and npm versions used by App Services are v0.10.40 and v1.4.28 respectively, which are relatively old and caused some errors in npm.

Using App Settings you can set the Node.js version for the App Service instance, but I couldn't find a switch for npm, which was the one actually causing the error above, so I decided to use another way to specify the Node & npm versions: adding them to the package.json file, as below:
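The addition uses the standard engines field in package.json; the version numbers below are only examples of what was available on App Service at the time, so pick whatever your app actually needs:

    {
      "engines": {
        "node": "8.11.1",
        "npm": "5.6.0"
      }
    }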

I made a minor update and pushed the new version to the Azure App Service local Git repo, and yet I stumbled upon another error


I created a new folder called dist under wwwroot so the script would be able to create the iisnode.yml file. I thought that was it; however, I found another error, this time in the gulpfile.js syntax.


The reason this time is that the Node version used to run the scripts is still the same old version, so I had to add a new line to deploy.cmd just before the build command line to point it at the newer runtime.
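I won't claim this is the exact line, but the usual trick on a Windows App Service is to prepend the folder of the desired Node version to the PATH inside deploy.cmd, something like:

    :: Assumed example only - adjust the version to whatever is installed on the instance
    SET PATH=D:\Program Files (x86)\nodejs\8.11.1;%PATH%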
And finally, my deployment was successful.



And what was supposed to be a one-minute job turned out to be a half-hour job.

Tuesday, 21 August 2018

SPFx: How OTB News Webpart displays ViewCount


In this blog post I'll explain how the out-of-the-box News web part displays the view count for each promoted site page, aka "news page". At first glance, when a colleague asked me, I naively answered with "Oh, check the ViewCountLifeTime managed property", thinking, silly me, that the modern pages web part would somehow use the same technique that old publishing pages used to embrace, which had the viewCountLifeTime and viewsLastNDays managed properties that gave us a lot of options to choose precisely what we want to display.

Back then we weren't worried about where SharePoint stored these values, as it would all be enriched via the search pipeline, which was easy, awesome and just worked.
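For reference, this is the kind of query we used to run against classic publishing pages (and, as the rest of this post explains, it comes back empty for modern site pages). ViewsLifeTime is my best guess at the managed property name, and the query text is only an example:

    import { SPHttpClient, SPHttpClientResponse } from '@microsoft/sp-http';

    // Classic-style approach: ask search for pages and their lifetime view counts.
    export async function getPageViewCounts(spHttpClient: SPHttpClient, webUrl: string): Promise<any> {
      const url: string =
        `${webUrl}/_api/search/query?querytext='FileExtension:aspx'` +
        `&selectproperties='Title,Path,ViewsLifeTime'`;
      const response: SPHttpClientResponse = await spHttpClient.get(url, SPHttpClient.configurations.v1);
      const result: any = await response.json();
      // The rows typically sit under PrimaryQueryResult.RelevantResults.Table.Rows,
      // but the exact JSON shape depends on the OData mode, so inspect the response.
      return result;
    }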

When my colleague tried out search with some of the classic view-count managed properties, they got nothing, which left me scratching my head and honestly questioning my sanity. I had been a bit away from a hands-on role, and by a bit I mean it had been more than four years since I'd rolled up my sleeves and coded stuff, as I currently do mainly high-level activities. Without further ado, I decided to look under the hood of the OTB News web part.

The first thing I noticed, after viewing the bundled code with the pretty-print feature in Google Chrome (in the file sp-news-webpart.bundle_en-us_someguid.js), is that when the component is about to mount a function called "updateRealNewsItems" is called; this method takes a set of items (the news items with the view count) as a parameter.



With a bit more digging, by searching for "viewCounts", I managed to find an expression that verifies whether the view count is needed or not, which depends on the display template you choose. If the view count is needed, the code first tries to update the view-count data provider information and then calls refresh data.


The code continues by preparing a request, getting the view count, and caching it so that next time it can be served from the cache, then adding the view count to the news items returned by search.

That kept me wondering: is it that hard to enable the search pipeline to update the view counts to accommodate the Site Page content type so we can rely on the viewCount managed property? I really have no answer to that; maybe it is. Additionally, I believe that if the modern web parts were available on GitHub it would make SharePoint developers' lives easier.

Just my $0.02