Thursday, 7 December 2017

SharePoint Addin: VSTS CI/CD pipeline hosted agent challenge

In this post I will explain how to set up a CI/CD pipeline in VSTS for SharePoint Add-ins without the need to install PnP PowerShell scripts on your build/release agents.

By default, the hosted VSTS release agent doesn't include the SharePoint Online PowerShell cmdlets. The easy and straightforward option is to use your own agent and install the SharePoint PowerShell cmdlets on it. However, I want a more portable option that allows me to use the hosted agent without maintaining a release VM.

PnP PowerShell cmdlets

Firstly, what is the underlying logic that the PnP PowerShell cmdlets encapsulate? It's basically HTTP calls to the SharePoint Online RESTful APIs. So, in a way, we can replace the PowerShell cmdlets with simple HTTP requests.

Gulp to the rescue

By default, the VSTS hosted agent has node and gulp installed, so we don't need to worry about setting up the agent. We will build a gulp task that allows us to publish a SharePoint Add-in to our app catalog. The main steps are:
  • Get the app principal
    In order to upload the app package to the app catalog we need an app principal which will run in app-only mode. Check my post here to learn how to get the client id and client secret.
  • Acquire an access token
    Using the sharepoint-apponly nodejs module we can get the access token.
  • Upload the .app package to the app catalog site.
    First we will create a new file, let's name it sharepoint.js. Then we import the fs and http modules and create a single function, uploadFile, which will be exported and used in our gulpfile.js.
    Here is the sharepoint.js, and the gulpfile.js.

Putting it all together

  1. Let's create a new directory and initialize a new node module:
  2. Create sharepoint.js and gulpfile.js and paste our code there.
  3. Install the needed dependencies, which include the sharepoint-apponly module explained here.
  4. Create a new build definition with a step that copies gulpfile.js, sharepoint.js and package.json to the artefact directory.
  5. In the release definition, create two release steps. The first step simply installs the npm dependencies.
  6. The other step is a gulp task that runs the publish-app task defined in gulpfile.js. Notice you can supply the parameters as arguments, which will be evaluated from release definition variables.
  7. The hosted agent can now copy the app package to the appcatalogUrl, which in my case is defined in the release variables.
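To make step 6 concrete, here's a sketch of how the publish-app task can read its parameters from the command line; the parseArgs helper is my own illustration, not part of gulp:

```javascript
// Sketch: tiny "--name value" parser so release-definition variables can be
// handed to the gulp task as CLI arguments, e.g.
//   gulp publish-app --catalogUrl $(appCatalogUrl) --fileName $(appFileName)
function parseArgs(argv) {
  var args = {};
  for (var i = 0; i < argv.length; i++) {
    if (argv[i].indexOf('--') === 0) {
      args[argv[i].slice(2)] = argv[i + 1];
    }
  }
  return args;
}

// In gulpfile.js the task would then look roughly like (gulp assumed):
//   gulp.task('publish-app', function (done) {
//     var a = parseArgs(process.argv.slice(2));
//     sharepoint.uploadFile(a.catalogUrl, a.fileName, a.packagePath, a.accessToken, done);
//   });
```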

Wednesday, 8 November 2017

OfficeDev: Register Custom Connector Teams vs Groups

In this post, I'll walk you guys through how the registration process of Office 365 connectors differs between Microsoft Teams (a connector for a specific channel) and a group connector for specific group conversations.
All Office 365 connectors have a single endpoint to register the connector, which can be accessed via the url . You need to fill in your connector information, including an icon that will appear when users configure it for the inbox, groups, or even Microsoft Teams.

How to create a new connector is not the topic of this blog post; if you are interested, you can refer to this MSDN article here.

However, today I'll walk you guys through creating a new custom connector and side-loading it as a Teams app.

  1. Using the teams yeoman generator, create a new teams app. If you want to learn how to run yo teams, refer to the readme page of the generator-teams github repo.
  2. Choose Connector from the generator options.
  3. You will be prompted to provide the connector Guid, which you can get from the connectors portal.
  4. The generator will generate sample Typescript code for your connector, then run npm install in the current directory, followed by a success message.
  5. Create an Azure app service to host the connector; alternatively you can use ngrok to host and run it locally. In my case I used an existing Azure app service.
  6. Create a local git repo for the azure app service.
  7. Initialize your local git repo and commit the changes.
  8. Push your code to the azure app service.
  9. You will notice that the generated deploy.cmd file will attempt to run npm install on the remote azure app service.
  10. Let's package our teams app manifest file to side-load it into our Teams client application.
  11. Now, using the Microsoft Teams client app, choose any team and select the apps tab (if you see a bots tab instead, you need to enable side-loading apps and switch to developer preview, which is explained here).
  12. After sideloading our app, which consists of a single connector, let's put the connector to the test by adding it to a channel within the team we sideloaded the app to. This can easily be achieved by selecting connectors from the channel drop-down menu.
  13. Sadly, the sideloaded connector appears at the end of the available connectors, so you might have to scroll all the way to the end to find your newly added custom connector.
When you click configure, a pop-up will appear to render the ***Connector.html page, where you replace *** with your connector name.
Now we've reached the highlight of this blog post, and probably the reason I wrote it in the first place. When you click the button labeled "Connect to Office 365", it sends a GET request with specific parameters to the endpoint.

The request parameters are exactly the same whether you initiate the request via the browser or from within the Microsoft Teams client application. However, the result is completely different. In the first scenario it creates a webhook for an Office 365 group conversation and prompts the user to select the targeted Office 365 group. In the second case a Teams channel webhook is created with no further user input.
So how does the endpoint correctly distinguish between the two request originators, and, more importantly, how does it know which team and which channel the webhook is associated with?

When I logged the requests I noticed two differences in the headers: the Teams request has a different user agent, and it also carries an object called TeamsContext.
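As a sketch, the endpoint can branch on that difference like this; note that the header name ('teamscontext' once Node lower-cases it) and the shape of the TeamsContext object are assumptions based on what I saw in the logged requests:

```javascript
// Sketch: decide where a connector registration request came from.
// Assumption: the Teams client sends a TeamsContext header (Node exposes
// incoming header names lower-cased) holding a JSON object that identifies
// the team/channel; browser-originated requests don't carry it.
function getRequestOrigin(headers) {
  var raw = headers['teamscontext'];
  if (raw) {
    // e.g. { teamName: ..., channelName: ... } tells us which channel
    // the webhook should be bound to, with no extra user input needed.
    return { origin: 'teams', context: JSON.parse(raw) };
  }
  // No Teams context: fall back to the group-connector flow and prompt
  // the user to pick a target Office 365 group.
  return { origin: 'browser', context: null };
}
```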

Now let's test the connector by sending a GET request to https://connectorURL/api/Connector/ping. You will notice a message card with a single viewAction appears.

This is how the endpoint distinguishes the two requests, and how you can easily build and host a custom Microsoft Teams channel connector.

Monday, 21 August 2017

OfficeDev: Don't build your own notification engine, use the existing one

If you take a look at the Office 365 top navigation bar, you will notice a nice notification bell next to the gear icon. I've seen a lot of developers trying to re-invent the wheel and build their own notification user interface, which eventually becomes very confusing for end users. To be honest, I've done that in the past, but it was on SharePoint 2013 on-premises, and my excuse was that there was no out-of-the-box notification engine.

Let's first take a step back and see how notifications work in Office 365, or to be more precise, in Outlook web access.

If we take a sneak peek into the requests sent while a native Office 365 page is loading, you will find this interesting request to the outlook notification endpoint.

Which describes a single type of notification performed by the outlook web app (pull notification). If you look more carefully into the requests sent you will also find a subscribe-to-notification POST request. Interestingly enough, it subscribes to five different types of notification events:
  • HierarchyNotification
  • ReminderNotification
  • NewMailNotification
  • SocialActivityNotification
  • SuiteNotification
And a sixth one, with the prefix "RowNotification" trailed by a base64-encoded string which seems to be an encoded guid (to be honest, I didn't spend any time trying to figure out what this guid represents).

To understand the various subscription models in outlook there is a great article at , and a more recent read can be found at .

In order to use the built-in notification engine in Office 365, we need to create a new entity of one of the types the Office 365 web interface is already subscribed to. However, we don't want to create a lot of unnecessary noise by having our notification objects lurking as annoying meeting requests or emails.

After some quick thinking, I chose to create the notification as an outlook task. The good news is that Microsoft graph already has a beta endpoint that enables you to create an outlook task.

The trick here, as you guys can see, is to enable the reminder by setting two values: isReminderOn to true and reminderDateTime to the reminder time. If you do that, when the reminder time approaches you will get a nice popup notification using the native notification engine.
The task notification will appear in the second div of the notification roster, which is reserved for reminders (both events and tasks).
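As a rough sketch, the task body for the beta endpoint (POST https://graph.microsoft.com/beta/me/outlook/tasks at the time of writing) can be built like this; only isReminderOn and reminderDateTime matter for the popup, the rest is illustrative:

```javascript
// Sketch: build the JSON body for creating an outlook task that doubles as
// a notification. Only isReminderOn and reminderDateTime are essential;
// subject is whatever you want the popup to say.
function buildNotificationTask(subject, remindAtIsoUtc) {
  return {
    subject: subject,
    isReminderOn: true,           // the switch that makes it pop up
    reminderDateTime: {
      dateTime: remindAtIsoUtc,   // e.g. '2017-08-22T09:00:00'
      timeZone: 'UTC'
    }
  };
}
```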

The mechanics 

Basically, a periodic POST request to the url is triggered to get any notification the outlook web app subscribed to in the earlier POST request.

When a change occurs, especially in the case of our task, a subsequent call to is made to get the actual reminder items, which will be rendered as below.
And voila! You can use outlook tasks as custom notification items with no front-end customization at all.

Saturday, 5 August 2017

Basic botframework Yeoman generator

I think it's about time to talk about the botframework generator I wrote around January this year (2017). Since April 2016 I've been fascinated by what Microsoft's botframework can do. I wrote a three-part blog post describing in detail how to build an office assistant utilizing both the Microsoft botframework and LUIS (language understanding intelligent service); for more details have a look here.

Earlier this year I was trying to speed up the scaffolding process of bot development using the botframework, and to be more precise I was focusing on the nodejs version of the sdk. In a day or so I ended up building a basic Yeoman generator that generates three types of sample bots, similar to the choices you get with the azure bot service.

The generator is not a very sophisticated one; as a matter of fact it's a very basic "attempt" so far, and I plan to start adding more templates. One template I'm thinking about is an Office 365-aware bot.

The node package was published almost 7 months ago @ and it can be installed using

After the installation, all you really need is to create a new directory and run the following command.

Then you will be guided through simple steps to generate the skeleton of your bot. It will first ask you to specify the name of the bot, then the author, and then you will have to provide the location of the generated files.

After that comes the main decision: which type of bot you want, out of three templates. For simplicity we will choose an echo bot, which basically replies back with exactly the user's message.

After the generator is done generating the seed of your bot, you can either update it, add your own logic and make it more than just a basic skeleton, or you can easily choose to run the generated code, connect to it using the emulator and start playing around with it. If you choose to run it, this is exactly what you will get.

The end-to-end process takes less than 3 minutes to install, generate and run your bot. I'm currently in the process of adding more templates to the generator; any contribution is very appreciated.
The code can be found at

Wednesday, 12 July 2017

Event Driven Development in Office 365

The beginning: Event Receivers

Since the early days of SharePoint on-premises we could easily register event receivers at the web, list and item levels to trigger custom actions when a particular event occurs. These abilities were carried over to SharePoint online, where the custom action is hosted in a remote endpoint (Remote Event Receivers).
You can register event receivers based on the list template or item content type. A similar technique exists in Project Server and Project Online, with a slight difference in naming (Event Handlers). So basically we've been doing event driven development in SharePoint Online since the beginning of Office 365.

What's not good about Event Receivers

1. Unlike SharePoint on-premises, remote event receivers are loosely coupled from SharePoint online, and if the event receiver endpoint is down for some reason, SharePoint online won't retry executing the event, with the exception of app-related remote event receivers.
2. Remote event receivers and Project online event handlers are built as WCF endpoints, so SharePoint online and Project online send SOAP messages to these endpoints (not very portable, huh!)

A Whole new world!

Microsoft is gradually replacing traditional remote event receivers with one of the following options:

Microsoft Flow

An easy-to-use tool with a nice user-friendly interface that allows super users and IT pros to build event driven scenarios, with more than 446 templates and 85+ triggers including a generic HTTP endpoint, which opens the door for unlimited possibilities. Microsoft flow offers an admin interface via where IT pros can build self-service flows using the browser interface.

Azure Logic Apps

Microsoft flow is built on top of azure logic apps. They both have the same designer and the same list of connectors. Azure logic apps is the preferable option in B2B mission critical scenarios, and it is managed like any other azure service via the Azure portal. Logic apps can be designed using the browser, or in Visual Studio using the azure logic apps extension for visual studio, which requires the Azure Resource manager SDK to be installed first.

Here is the designer within Visual Studio 2015.


Webhooks are another option, more suitable for developers and B2B applications. Webhooks give you a means to build an event driven application at massive scale without the need for an azure subscription. Webhooks were introduced first within Microsoft graph outlook resources like mail, contacts, and calendar. In addition to Microsoft Graph resources, they have been introduced to SharePoint as well.
You can create a subscription via a simple POST request.
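For the SharePoint flavour, a sketch of that subscription body; the field names follow the documented subscription shape, while the helper and its parameters are my own illustration:

```javascript
// Sketch: build the body for
// POST {site}/_api/web/lists(guid'<list-id>')/subscriptions
function buildSubscription(siteUrl, listId, notificationUrl, clientState, expirationIsoUtc) {
  return {
    resource: siteUrl + "/_api/web/lists(guid'" + listId + "')",
    notificationUrl: notificationUrl,     // must be reachable over https
    expirationDateTime: expirationIsoUtc, // at most ~6 months ahead
    clientState: clientState              // opaque value echoed back to you
  };
}
```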

Webhooks enable you to build more complex solutions where the subscription or the flow-trigger can be created on the fly based on another trigger.

Friday, 23 June 2017

All hail the new Shiny Office Seller Dashboard

Finally! It's been almost an eight-month wait since I first learned that the office store Seller dashboard would be merged with the other developer stores to become one single Microsoft Developer dashboard.

I was checking my Seller dashboard account as usual and noticed that the new dashboard had been rolled out; now I can access all my apps in one central location.

My first interaction with the Seller dashboard was back in late 2012, when the Office store was in beta. I remember I had two major issues with the provider/developer experience, and these issues caused me a great amount of frustration.

The first issue- If your add-in is free, you have absolutely no idea who downloaded it, but if it's paid you get limited information extracted from the sales report, which gives you a very simple tabular view of the sales transactions. This information includes only the following:
(Market, Country, State if the buyer is within the US market, and the local currency purchase amount)

There was no way of getting any kind of information about the acquisition or any kind of user contact details.

With the new Developer dashboard, an additional option has been added for add-ins. This option allows the add-in provider to store lead information in a target system of choice with a simple click of a button (Edit Lead Configuration). This option is available for both free and paid applications.

The available targets for the lead information are:

  1. Dynamics CRM Online
  2. SalesForce
  3. Azure Table 
  4. Marketo
  5. AzureBlob 
To be honest, seeing this option after all these years makes me super excited. Now Office Add-in providers can use the store as a proper lead generation tool. They can be more proactive and contact application consumers, understand why the conversion rate for a specific application is low, and seek proper feedback in order to improve and provide a better service.

In addition to these benefits, having such a mature platform will improve the quality of the add-ins listed on the store, as providers can now use it as a proper marketing tool.

The second bit- I always found the existing reports very basic; they give only some limited metrics (view/download/purchase/trial) spanning just the current week and the past three weeks. There was no way to see my add-in's performance this quarter vs. the same quarter last year unless I managed somehow to store the data somewhere else.

The new dashboard has a new report called acquisitions, which unfortunately I couldn't make work (currently I'm getting a blank page), but I presume this report will answer many of the questions I have. If you still need access to the old reports, you can access them by viewing the legacy reports.

Another exciting part is having "Teams App" as an additional app type that you can submit to the office store, although there is no actual category for Teams Apps on the Office store website yet.


Friday, 16 June 2017

SharePoint webhooks: the good, the bad and the ugly

I'm always a fan of separation of concerns, as it simplifies and abstracts development. The introduction of the App model in SharePoint 2013 was a huge step forward, letting my custom code run in an isolated sandboxed environment where I can easily debug and troubleshoot.

Moving forward, I started working with SharePoint online and became a big fan of using SharePoint remote event receivers; they give me total control over how to build things. What I liked about remote event receivers is that there are no retries: SharePoint Online simply triggers the endpoint call to the WCF service and doesn't really bother with any response back. It's totally up to you to build your own failover mechanism, which can be an ECM custom action that resends the same message to the service endpoint.

The only exception to the no-retries behaviour is the AppInstalled Remote Event Receiver, which retries 3 times before it gives up.

With SharePoint Webhooks, it's completely different

"The Good"- Your endpoint has to be verified at creation time

Some people, including me, would argue that it's good practice to verify the endpoint at creation time, as it ensures you don't register an invalid URL.

"The Good"- it's basically a HTTP POST request to the notification client endpoint

This one is a major advantage, as it makes webhooks easier to implement than the old WCF service endpoint and allows developers to build notification clients using their own tool of choice.
**Although I have built a Remote Event Receiver endpoint using nodeJS and wcf.js

"The Bad"- You can't register the webhook for a specific library event
I see this as a big disadvantage. I only need to receive a notification when a specific event occurs on the resource (the list); I don't really care about other events, and I don't need all that noise from SharePoint online.

"The Bad"- You need to keep your webhook alive
For some bizarre reason, webhooks expire after a period of time (6 months), so your application needs to update the registered subscriptions and extend the expiration.
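A sketch of the renewal call your application would have to issue periodically; the helper is illustrative, but the PATCH-with-a-new-expirationDateTime shape follows the documented API:

```javascript
// Sketch: describe the request that extends a webhook subscription.
function buildRenewalRequest(siteUrl, listId, subscriptionId, newExpiryIsoUtc) {
  return {
    method: 'PATCH',
    url: siteUrl + "/_api/web/lists(guid'" + listId + "')/subscriptions('" +
      subscriptionId + "')",
    body: { expirationDateTime: newExpiryIsoUtc } // again max ~6 months out
  };
}
```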

"The Ugly"-In short the notification message  is basically  useless
The notification message consists of one or more of the following notification objects.
As you can easily tell, you can only reconstruct the resource object; you have absolutely no clue why you got this notification, which lets the resource (document library) send a lot of unnecessary noise to your notification client. In order to get the changes, you have to call the list's GetChanges endpoint to understand why you received the notification object and decide whether to act or not.
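A sketch of that follow-up: the notification fields used here (resource, siteUrl) follow the documented payload, while the helper itself and the tenant host parameter are my own illustration:

```javascript
// Sketch: turn the near-empty webhook notifications into the GetChanges
// calls needed to find out what actually happened. 'resource' is just the
// list id and 'siteUrl' is server-relative, so the host must be supplied.
function buildGetChangesCalls(notificationBody, tenantHost) {
  return notificationBody.value.map(function (n) {
    return {
      method: 'POST',
      url: 'https://' + tenantHost + n.siteUrl +
        "/_api/web/lists(guid'" + n.resource + "')/GetChanges",
      // Restrict the change query to the item-level events we care about.
      body: { query: { Item: true, Add: true, Update: true, DeleteObject: true } }
    };
  });
}
```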

To be quite frank, Microsoft Graph webhooks are done in a very neat way; the only drawback is that they expire within 70 hours.

Tuesday, 18 April 2017

Add more Smarts to your bot: Detecting emotions from giphy posts

I've been blogging about bots since April 2016, which was about the time I discovered Microsoft's amazing botframework. I've written a series on how to detect user intent from text messages using LUIS (Language Understanding Intelligent Service); you can find it here.

In this post I'll talk about understanding user emotion using the Giphy posts embedded within Microsoft teams. This can be generalized to any image communication between you and the bot.

First Let's get the Image Content

Using the Microsoft teams insert-Giphy functionality, we can see that it adds an attachment to the message between the user and the bot; this attachment is a mere link to the chosen Giphy.

Let's add the image understanding capability to our bot

We will use the emotion detection service, which is part of cognitive services, to detect the giphy's emotion. It's basically a POST request to the emotion detection service with the subscription key added as an "Ocp-Apim-Subscription-Key" header:

        request({
            url: '',
            method: 'POST',
            headers: {
                'Ocp-Apim-Subscription-Key': '*Add Your Subscription key here*',
                'Content-Type': 'application/json'
            },
            body: {
                'url': 'your Giphy image URL'
            },
            json: true
        }, function (err, response, body) {
            if (err) {
                // handle the error
                return;
            }
            // successful call - body contains the emotion scores
        });
The successful response will contain an object with a score for each possible emotion. I've created a function that returns the highest-scoring emotion, and the bot then sends a message to the user reflecting the detected emotion.
function getHighScoreEmotion(body) {
    var val = 0;
    var emotion = null;
    if (body.length > 0) {
        for (var score in body[0].scores) {
            if (body[0].scores[score] > val) {
                val = body[0].scores[score];
                emotion = score;
            }
        }
        return emotion;
    }
    return null;
}

And this is how it looks when you send your bot a giphy :)

Now our bot can understand and respond to the Giphys shared by the user.

Wednesday, 15 March 2017

Outlook Add-in Attach Files from Dropbox

In this post I will walk you through how to create a simple Outlook Add-in which will allow you to attach files from your dropbox account. What you will need is:

  1. Sublime or Visual Studio Code (no need for full Visual Studio)
  2. The Office Generator; follow the instructions @ to make it work
  3. A Dropbox account to create a Dropbox App
  4. A client library for the dropbox API, which you can find @
  5. Create an empty directory and run Yo Office
  6. Follow the instructions as in my previous post here
  7. What we will create is the following:
    1. An Angular service to read the dropbox directory
    2. An Angular directive to use the service and render the files
    3. The directive's isolated scope will have the following properties:
      1. dir: represents the root directory, defaulting to "/"
      2. receiverUrl: the redirect url (has to match one of the redirect urls in the dropbox developer app definition)
      3. clientId: the dropbox app key
      4. template: url of the template
    4. The directive will use the DropboxAPI service to load the files initially from the root directory.
    5. The directive will watch for any change in the "dir" property, and we will use this to refresh the file list when clicking on a folder.
    6. This is how the Add-in works.
I'm polishing the code and it will be available soon on github. Happy Office Development!
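In the meantime, here's a hedged sketch of the HTTP call the DropboxAPI service would wrap, using Dropbox API v2's files/list_folder endpoint (in v2 the root folder is the empty string, so the directive's default "/" gets mapped):

```javascript
// Sketch: describe the request the Angular service would issue to list a
// Dropbox folder (API v2). The helper is illustrative; the endpoint and
// payload shape follow the public Dropbox HTTP API.
function buildListFolderRequest(dir, accessToken) {
  return {
    method: 'POST',
    url: 'https://api.dropboxapi.com/2/files/list_folder',
    headers: {
      Authorization: 'Bearer ' + accessToken,
      'Content-Type': 'application/json'
    },
    body: { path: dir === '/' ? '' : dir } // v2 uses '' for the root folder
  };
}
```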