Monday 12:00 am
0 minutes wasted
Distraction
  • Push-notifications began as a mechanism for alerting people they had new mail in their inbox
  • As app engagement and time online became commercial drivers, push-notifications were repurposed to nudge users toward increased app usage
Result
  • Push-notifications compete for user attention
  • Information overload
  • Limited end-user control
  • Smartphone addiction
  • Increased anxiety, FOMO, NOMOPHOBIA
Oops

People receive, on average, approx. 60 notifications per day

Oops

People dismiss as many notifications as they open

Oops

Personalised & persuasive notifications = higher CTR

Oops

Persuasive features are the strongest predictors of whether a user will open or dismiss a notification

Persuasion

In our work*, Cialdini's six principles of persuasion are extracted from push-notifications to scrutinise how designers may be manipulating us into opening notifications. Cialdini's principles are as follows:

  • Authority (P1)
    People follow and respect requests made by an authority
  • Scarcity (P2)
    People will place higher value on something which is rare
  • Liking (P3)
    People will follow what they like
  • Social Proof (P4)
    People will do what they see their peers doing
  • Commitment & Consistency (P5)
    People tend to follow through on their word and uphold behaviours associated with their own self-image.
  • Reciprocity (P6)
    People feel obliged to return a favour

* Full results to be presented at Persuasive Technology 2019

Entity

Each bubble above represents a Notification entity, and each entity contains a number of features, such as the app from which it was sent, the subject of the content, the time of day it was received, and so on. In order to use this information to train and implement intelligent algorithms (such as common Machine Learning algorithms), the Notification entity needs to be converted to a machine-readable format, i.e. a vector of numerical values.
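A minimal sketch of such an entity, assuming illustrative field names (app, subject, hour, weekday) rather than the study's exact schema:

```python
# Hypothetical Notification entity with a mix of categorical and
# numerical features; field names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Notification:
    app: str        # categorical: source application
    subject: str    # categorical: content subject
    hour: int       # numerical: time of day (0-23)
    weekday: int    # numerical: day of week (0 = Monday)

n = Notification(app="WhatsApp", subject="message", hour=9, weekday=0)

# Numerical features can be used directly, but the categorical ones
# still need encoding before a model can consume the entity as a vector.
raw = [n.app, n.subject, n.hour, n.weekday]
print(raw)  # ['WhatsApp', 'message', 9, 0]
```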

One-Hot Encoding

Categorical features, such as the app or subject, subsequently need to be converted to numerical representations via One-Hot Encoding. This vastly increases the vector size, since each categorical feature contributes one dimension per unique value.
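The mechanics can be sketched in a few lines of pure Python (the app names below are examples, not the data set's vocabulary):

```python
# One-hot encoding a single categorical feature: each unique value
# gets its own dimension in the output vector.
apps = ["WhatsApp", "Gmail", "Snapchat"]       # unique values of one feature
index = {v: i for i, v in enumerate(apps)}

def one_hot(value, index):
    vec = [0] * len(index)
    vec[index[value]] = 1
    return vec

print(one_hot("Gmail", index))  # [0, 1, 0]
# A feature with hundreds of unique values adds hundreds of dimensions,
# which is why the final vector grows so quickly.
```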

Embedding Model

As an alternative to One-Hot Encoding the features of the Notification entity, an embedding model can be learned which can convert the Notification entity to a vector with fewer dimensions (than a One-Hot Encoded vector) by training it on a simple supervised ML task. For example, the task of predicting the action a user takes on a notification (open/dismiss) given the notification and context features.

The neat aspect of the embedding model is that it not only converts the Notification entity to machine readable form, but it also learns to represent each notification in a meaningful way such that notifications which prompt similar behaviours are closer in the multi-dimensional vector space. This occurs because the vector dimension is reduced - the model is forced to learn higher-level relationships between the features as there are not enough dimensions for each individual feature value.
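The core idea can be sketched as an embedding lookup feeding a small prediction head; the dimensions and the training loop below are illustrative assumptions, not the study's exact setup:

```python
# Embedding sketch: a 50-value categorical feature is compressed from
# 50 one-hot dimensions to an 8-dimensional learned vector.
import numpy as np

rng = np.random.default_rng(0)
n_apps, emb_dim = 50, 8
E = rng.normal(size=(n_apps, emb_dim))   # embedding table (learned in practice)
w = rng.normal(size=emb_dim)             # logistic head for the open/dismiss task

def predict_open(app_id):
    z = E[app_id] @ w                    # table lookup replaces the one-hot product
    return 1 / (1 + np.exp(-z))          # P(open)

p = predict_open(3)
print(round(p, 3))
# After training on open/dismiss labels, notifications that prompt similar
# behaviour end up with nearby rows in E - the clusters the visualisation shows.
```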

Embeddings Visualised

The result can be viewed above by expanding the cluster of notifications. The position of each bubble is a notification entity's corresponding embedding value (embeddings are converted to 2-dimensional space for display purposes).

The visualisation above highlights how the model learned to cluster notifications by app, and further by day of the week and time of day. Note also that notifications generated by apps such as WhatsApp, Snapchat and SMS (i.e. messenger apps) are close together in the vector space, suggesting the model identified these types of notifications to be similar in nature!
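The write-up does not state which projection produces the 2-D display coordinates; PCA via SVD is one common choice and can be sketched as:

```python
# Project high-dimensional embeddings to 2-D for plotting (PCA via SVD;
# the embedding values here are random stand-ins for the learned ones).
import numpy as np

rng = np.random.default_rng(1)
emb = rng.normal(size=(100, 8))          # 100 notification embeddings, 8-D

centred = emb - emb.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
xy = centred @ Vt[:2].T                  # (x, y) position for each bubble

print(xy.shape)  # (100, 2)
```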


Reinforcement Learning

As a potential solution for notification management, reinforcement learning methods are tested on existing in-the-wild and subsequent synthetic data sets.

There are many features known to be associated with the user's action on a notification; however, the number of features selected also expands the state-space and increases training and prediction times. For the purposes of this preliminary study, only the following features (which are known at delivery time) were selected:
{ category, app, time-of-day, day-of-week }

Q-Table

A Q-Table is implemented to learn to mediate notification delivery on behalf of the user in order to alleviate information overload.

Example Result:
  • Trained on 3,866 notifications, tested on 429 using 10-Fold Cross-Validation
  • Accuracy: 78%
  • Precision: 78%
  • Recall: 81%
  • F1: 80%
  • Time to Train: 70s
  • Time to Test: 39s

Deep Q-Network

A Deep Q-Network (DQN) is implemented to learn to mediate notification delivery on behalf of the user in order to alleviate information overload.
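The DQN replaces the table with a network mapping the encoded state to one Q-value per action; layer sizes and the target rule below are illustrative assumptions, not the study's exact architecture:

```python
# Sketch of a Q-network forward pass and its TD target.
import numpy as np

rng = np.random.default_rng(0)
state_dim, hidden, n_actions = 16, 32, 2   # actions: deliver / suppress

W1 = rng.normal(scale=0.1, size=(state_dim, hidden))
W2 = rng.normal(scale=0.1, size=(hidden, n_actions))

def q_values(state):
    h = np.maximum(state @ W1, 0.0)        # ReLU hidden layer
    return h @ W2                          # one Q-value per action

def td_target(reward, next_state, gamma=0.9):
    return reward + gamma * q_values(next_state).max()

s = rng.normal(size=state_dim)
q = q_values(s)
print(q.shape)  # (2,)
# Training minimises (td_target - q[action]) ** 2 over logged interactions,
# so states never seen in the table can still be generalised over.
```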

Example Result:
  • Trained on 3,866 notifications, tested on 429 using 10-Fold Cross-Validation
  • Accuracy: 80%
  • Precision: 79%
  • Recall: 87%
  • F1: 83%
  • Time to Train: 257s
  • Time to Test: 19s

DQN Synthetic

A Deep Q-Network (DQN) is implemented, but this time trained and evaluated with Synthetic data.


Example Result:
  • Trained on 4,500 notifications, tested on 500 using 10-Fold Cross-Validation
  • Accuracy: 79%
  • Precision: 74%
  • Recall: 90%
  • F1: 81%
  • Time to Train: 70s
  • Time to Test: 39s

Lacking creativity 🎨 or inspiration 💡?

EmPushy can analyse current trends 📈 in the media and generate empathetic 💜 notifications 🔔 based on topics relevant to you and your subscribers.

We create the message ✍ to send, you concentrate on creating that content 🎬 !

Conditional Text Generation

As a potential solution for notification creation, conditional text generation methods are explored for automatically producing notification text.

GPT2

The model generating the text in the demo above is a fine-tuned version of GPT2.

Clickbaiting & Topic

The fine-tuned GPT2 model can also be conditioned to generate certain types of natural language given a prompt. This is powerful as it allows greater control over what is generated, enabling specific use cases, e.g. a sports app generating sport-related notification content. In the demo above, you may select from a range of topics to generate notification text.

We are also interested in features of text which are enticing. The clickbaiting feature is indicative of persuasive language, thus we can condition the model on this to generate persuasive texts. The options are clickbaiting, which should be enticing, and non-clickbaiting, which should be more neutral.
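One common way to implement this kind of conditioning is to prepend control tokens to the prompt; the token format and topic names below are assumptions about how such a fine-tuned model might be steered, not the actual scheme used:

```python
# Hypothetical control-token prompt construction for conditional generation.
def build_prompt(topic, clickbait, seed_text=""):
    style = "<clickbait>" if clickbait else "<neutral>"
    return f"<topic={topic}> {style} {seed_text}".strip()

prompt = build_prompt("sport", clickbait=True)
print(prompt)  # '<topic=sport> <clickbait>'
# A model fine-tuned on text prefixed with these tokens learns to continue
# the prompt in the requested topic and tone.
```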

Emojis

Emojis have been shown to vastly improve the CTRs of notifications. They not only catch the eye but also allow the reader to grasp nuances of the message, such as its sentiment or topic, more quickly. For the demo above, we use BERT to represent the text as context vectors and use the similarity between vectors to select appropriate emojis. You can select either:

  • Summary
    The whole text generated by GPT2 is converted to a single context vector and summary emojis are generated which best represent the text as a whole. The emojis are appended at the end of the text
  • Keyword
    Keywords are extracted from the text generated by GPT2 and each is converted to a context vector. The best representative emoji is then selected for each keyword and added directly after the keyword within the text

EmPushy can generate tailored notifications 🔔 for your website which are both high-performing 💲 and empathetic 💜 toward your subscribers.

You can concentrate on creating great content 🎬 while EmPushy ensures it is delivered to subscribers in the most friendly 😊 and effective 💯 way possible!

Web Notification Creation

Autonomous creation of web push-notifications to help app creators and digital marketers push the right message at the right time to the right audience.

GPT2

The model generating the text in the demo above is a fine-tuned version of GPT2.
