Web push notification using Python

Building a web push service should be straightforward. But while implementing one, I found the resources lacking: there is no straightforward guide for building it into your existing stack or from scratch. That makes sense, as the technology is still in its early days. In this blog post I will cover, step by step, how to build a web push service. Whether you are implementing a push service from scratch or integrating it into an existing application, this post should help you reach your goal. It has the following sections:

  • How web push works
  • Building a push service
  • Browser support
  • References

How web push works

At a high level, web push needs three parties/components to work:

  • Client-side application: gets the user's permission, obtains the user's subscription token, and sends it to the backend service.
  • Push service: validates push requests coming from the backend service and forwards the push messages to the appropriate browser.
  • Backend service: persists users' subscription information and initiates push sending.

Steps to send/receive a web push notification

  1. The user accepts the push permission and the browser generates a push subscription token by communicating with the Push API.
  2. The client app sends the subscription information to the backend service, which persists it and uses it in the next steps.
  3. The backend service initiates the push and sends the payload to the specific push service (which is denoted in the user's subscription information).
  4. The push service receives the push notification and forwards it to the specific user, and the browser displays the notification.

Backend service using Python

We will build a REST interface which communicates with the client application and the push service. It will store the subscription information of users and distribute the VAPID public key. VAPID is short for Voluntary Application Server Identification; the generated public key will be used by the client app. We will need to develop the following API endpoints:

  1. /subscription
    • GET – to get the VAPID public key
    • POST – to store subscription information
  2. /push
    • POST – will send a push request to all users (used for testing)
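As a framework-free sketch of what those two subscription endpoints do (the real service in the gist uses an actual web framework and persistent storage; the names and the in-memory list here are illustrative only):

```python
import json

# Hypothetical in-memory store; a real service would persist subscriptions.
SUBSCRIPTIONS = []
VAPID_PUBLIC_KEY = "BASE64_URL_SAFE_PUBLIC_KEY"  # contents of public_key.txt


def get_subscription():
    # GET /subscription: hand the VAPID public key to the client app
    return {"public_key": VAPID_PUBLIC_KEY}


def post_subscription(body):
    # POST /subscription: persist the PushSubscription JSON sent by the client
    SUBSCRIPTIONS.append(json.loads(body))
    return {"status": "success"}
```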

Generate the VAPID key pair with the following commands:

openssl ecparam -name prime256v1 -genkey -noout -out vapid_private.pem
openssl ec -in vapid_private.pem -pubout -out vapid_public.pem

Create base64-encoded DER representations of the keys:

openssl ec -in vapid_private.pem -outform DER|tail -c +8|head -c 32|base64|tr -d '=' |tr '/+' '_-' >> private_key.txt

openssl ec -in vapid_private.pem -pubout -outform DER|tail -c 65|base64|tr -d '=' |tr '/+' '_-' >> public_key.txt
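The `base64 | tr -d '=' | tr '/+' '_-'` part of those pipelines just produces an unpadded URL-safe base64 string. As a sanity check, a small Python helper (shown only as an illustration) does the same transformation, since `urlsafe_b64encode` already maps `+/` to `-_`:

```python
import base64


def to_base64url(der_bytes):
    """Replicate `base64 | tr -d '=' | tr '/+' '_-'`: URL-safe alphabet,
    padding stripped."""
    return base64.urlsafe_b64encode(der_bytes).rstrip(b"=").decode()
```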

These VAPID keys will be used in the newly developed backend service. We will use the pywebpush library for sending the web push notifications, wrapping the call as below with the newly generated keys:

import os
from pywebpush import webpush, WebPushException


VAPID_PRIVATE_KEY = open(DER_BASE64_ENCODED_PRIVATE_KEY_FILE_PATH, "r+").readline().strip("\n")
VAPID_CLAIMS = {"sub": "mailto:youremail"}


def send_web_push(subscription_information, message_body):
    return webpush(
        subscription_info=subscription_information,
        data=message_body,
        vapid_private_key=VAPID_PRIVATE_KEY,
        vapid_claims=VAPID_CLAIMS,
    )

The full service code can be found here as a gist. Follow the gist README for details about running the service.

Frontend application to test the backend service

Rather than writing an application from scratch, let's use Google's push example client app, which you will find here.
Use the 08-push-subscription-change version, which is the last part of Google's step-by-step tutorial. Put the VAPID public key in main.js in the applicationServerPublicKey variable. The client-side application will use the public key to generate the subscription information, which will in turn be used by the push service.

Putting it all together

Meanwhile, pull the whole code from the gist, install the necessary packages, and run it with the following commands:

pip install -r requirements.txt

python api.py

Run the following command to get the VAPID public key from the service:

curl -X GET <service-url>/subscription

It will return the public key as a key/value pair. Copy the public key and paste it into the frontend application as the value of applicationServerPublicKey in main.js.

Navigate the browser to the push lab application and click on “Enable Push Messaging”; a browser pop-up will appear like below:

Click on “Allow” to give the application permission to show web push notifications. The client app will then generate the PushSubscription object, which we need to send to our backend service; the backend persists this information and uses it to send out push notifications.
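For reference, a PushSubscription serialises to JSON roughly like this; the endpoint and key values below are placeholders, but the field names (endpoint, keys.p256dh, keys.auth) come from the Push API:

```python
import json

# A hypothetical serialised PushSubscription, as the browser would produce it
subscription_json = """{
  "endpoint": "https://push-service.example/send/abc123",
  "keys": {
    "p256dh": "CLIENT_PUBLIC_KEY",
    "auth": "AUTH_SECRET"
  }
}"""

subscription = json.loads(subscription_json)
print(subscription["endpoint"])  # where the backend will POST the push
```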

It will generate the payload, which should be sent to the backend service with the following curl request:

curl -X POST <service-url>/subscription

The push will arrive at the top right of the screen, something like below:

Browser support

At the time of writing this blog post, only Chrome and Firefox support web push. You can find the latest list of supported browsers here.


On a Mac, while developing the backend service, openssl can throw an exception while sending out a push if the cryptography library cannot find the appropriate version of openssl. It looks like below:

Symbol not found: _EC_curve_nist2nid
Referenced from: /usr/local/opt/openssl/lib/libssl.1.0.0.dylib
Expected in: /usr/lib/libcrypto.dylib in /usr/local/opt/openssl/lib/libssl.1.0.0.dylib

To fix the issue, we need to export the openssl library path like below:

export DYLD_LIBRARY_PATH=/usr/local/opt/openssl/lib

I also faced an issue where the Python cryptography library could not find the right version of openssl and was installed against an inappropriate version. To overcome that, I had to uninstall and reinstall it like below:

pip uninstall cryptography
LDFLAGS="-L/usr/local/opt/openssl/lib" pip install cryptography --no-use-wheel


References

  1. Google's post detailing how push works
  2. Firefox's post on the web push API

Building a micro service with Django and Docker.

If you have never written a micro service before but know what a micro service is, this post will introduce you to writing a μ-service. It has been a new “buzz” floating around for the last couple of years. Read details.

A micro service architecture definitely has many advantages over a monolithic application; on the other hand, whether it makes sense to go with micro services depends on several factors. If you want to read more about the micro service pattern and its pros and cons, please check this post for details, especially the “Pros” and “Cons” sections.

Let’s not get into the debate and start writing some code. In this post we will be doing the following:

  1. Building a REST API using Django (DRF)
  2. Dockerizing the newly developed REST API and running it via uwsgi

Step 1: Building the REST API using Django:

We will be using Django REST Framework (DRF). The API will expose data for an (imaginary) event management company, which uses the API to manage its events and performers. For the sake of simplicity, our API will only let us add new performers and events, plus a listing endpoint that returns recent events with the associated performers' names.

So let's write some code:

Django REST Framework makes it easy to develop a REST API on top of Django: all one needs to do is define serializers and load query objects via Django models, and DRF takes care of the rest. As the API is minimal and we are doing CRUD, in the serializers we just need to extend serializers.ModelSerializer and that's it. Finally, views.py looks like below:

from rest_framework import viewsets

from .models import Event, Person
from .serializers import EventSerializer, PersonSerializer


class EventViewSet(viewsets.ModelViewSet):
    queryset = Event.objects.all()
    serializer_class = EventSerializer


class PersonViewSet(viewsets.ModelViewSet):
    queryset = Person.objects.all()
    serializer_class = PersonSerializer
You can checkout the codebase from here.

Step 2: Dockerizing the μ-service:

Let's check out the Dockerfile for details:

FROM python:2.7
RUN git clone https://github.com/mushfiq/djmsc.git djmsc
WORKDIR /djmsc
RUN pip install -r requirements.txt
RUN python manage.py migrate
RUN python manage.py loaddata data/dummped.json
CMD ["uwsgi", "--module=djmsc.wsgi:application", "--env=DJANGO_SETTINGS_MODULE=djmsc.settings", "--master", "--pidfile=/tmp/djmsc.pid", "--http=:8000", "--buffer-size=32768"]


In the Dockerfile we clone the repo, set the working directory, and install the dependencies. Then we create the database through manage.py, load the dummy data, and run uwsgi to serve the API.

Let's build and run the Dockerfile like below:

docker-machine start default #starting docker virtual machine named default
docker build -t mush/djmsc . #building docker image from the Docker file
docker run -d -p 8000:8000 mush/djmsc #running the newly build docker image

Look up the IP of the docker machine and then make a cURL request to check whether the REST API is up, like below:

api=$(docker-machine ip default) #returns in which IP docker-machine is running
curl $api:8000/person/?format=json | json_pp

And it returns a JSON response like below:


You can pull the docker image from here and start your own container 🙂

Good read: Building Microservices

Dockerize your golang application.

I have been playing with Docker for the last couple of projects, and so far it's been a good experience. In this post we will develop a REST API using Golang and then Dockerize the deployment. The REST API will be very basic, with just a single endpoint. We will cover the following sections in this post:

  1. Developing a minimal REST endpoint
  2. Writing a Dockerfile
  3. Running the Docker container with the API
  4. Finally, pushing the Docker image to a Docker registry


Developing REST endpoint

For developing the REST API we are going to use Gin, an HTTP web framework. It provides standard solutions for common problems, with features like middleware support, routing, and a standard convention for error management. Sometimes it makes sense to develop an HTTP application using only Go's net/http, but to avoid building everything from scratch, Gin gives us a good starting point.

The REST API returns the current time on the route “/”. The code looks like below:

package main

import (
	"time"

	"github.com/gin-gonic/gin"
)

func rootHandler(context *gin.Context) {
	current_time := time.Now()
	context.JSON(200, gin.H{
		"current_time": current_time,
	})
}

func main() {
	router := gin.Default()
	router.GET("/", rootHandler)
	router.Run(":8080")
}


In the main function we declare the router for the API and register the handler function for the endpoint. In the handler we get the current time and pass it to the Gin context.

Creating the Dockerfile

For Dockerizing the REST API, let's develop the Dockerfile. The official Golang Docker image is handy for building the binary, because it is used by many other developers and we don't need to do all the work of choosing an OS, pulling Golang, and setting up the environment ourselves.

FROM scratch
ADD main /
EXPOSE 8080
ENTRYPOINT ["/main"]


In the Dockerfile we use the bare-bones scratch image, which is a minimal base image. We add the pre-built api binary into the image, set it as the entrypoint, and expose the port of the docker container via EXPOSE 8080.

So far we have developed the API endpoint and written the Dockerfile. Let's build the Docker image and run the container.

Running the docker container

On the local development machine we are using docker-machine, so we need to follow these steps:

docker-machine start default
docker build -t mush/gondar .

docker build builds the Docker image (ours is named gondar) for the container, and then we need to run the image as a container with the following commands:

docker images #to list current docker images
docker run -d -p 80:8080 mush/gondar

In line 2 we run the newly built Docker image, forwarding the port from Docker to the host via -p. And we are good to go: our container is running, and we can check it with a curl command like below:

api=$(docker-machine ip default) #getting IP of the docker machine host
curl $api:80

which returns current time as JSON response.

Pushing the newly built Docker image

The last step is pushing the newly created Docker image to a registry; we will use Docker Hub.

First, we need to log in to Docker Hub with the following command:

docker login --username=yourhubusername --email=youremail@company.com

Then we push the newly created Docker image with the following command:

docker push mush/gondar 

That's it, we just created a REST API and built a Docker image to deploy it as a container-based service 🙂


Access key based authentication in DRF (Django REST Framework)

If you start developing a REST API, one of the fundamental requirements is an authentication system, which prevents anonymous users from accessing your REST endpoints.

For developing REST APIs I used to start from scratch with Django/Flask; then I used Piston, and when further development of Piston stopped, I started using Tastypie. Last year I was reading the DRF documentation and realised that my next REST API would be built on top of DRF, and I have been using it since. The documentation is well organised and it has a growing community around it.

So back to the point: in DRF you can have an access-key-based authentication system quickly, without much configuration or code.

While authenticating a user via access key, the core idea is to check whether any user exists with the provided access_key, and then to return the user or raise an exception.
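That check can be sketched framework-free, with a plain dict standing in for the Subscriber table (the key and user name here are illustrative only):

```python
# Hypothetical access_key -> user mapping, standing in for the database table
SUBSCRIBERS = {"3f2c9a": "alice"}


def lookup_user(access_key):
    # No key provided at all
    if not access_key:
        raise LookupError("Access key not provided.")
    try:
        # Return the matching user...
        return SUBSCRIBERS[access_key]
    except KeyError:
        # ...or raise when no user owns this key
        raise PermissionError("No user found with the access key")
```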

To begin, add a new file to your Django app called “authentication.py”. To write custom authentication in DRF, we subclass “BaseAuthentication” and override the “authenticate” method. authenticate takes the Django request object, from which we get the access key via request.GET.get(“access_key”, None). The whole subclass looks like below:

from rest_framework import authentication
from rest_framework import exceptions

from apps.newspaper.models import Subscriber


class AccessKeyAuthentication(authentication.BaseAuthentication):
    def authenticate(self, request):
        access_key = request.GET.get("access_key", None)
        if not access_key:
            raise exceptions.NotFound("Access key not provided.")
        try:
            user = Subscriber.objects.get(access_key=access_key)
        except Subscriber.DoesNotExist:
            raise exceptions.PermissionDenied("No User found with the access key")
        except ValueError:
            raise exceptions.ValidationError("Badly formed hexadecimal UUID string")
        return (user, None)


The next step is to add it to the REST_FRAMEWORK settings in the project settings (settings.py), like below:
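A minimal sketch of that settings entry, assuming the authentication.py file lives under apps/newspaper as above (DRF loads the class from the dotted path in DEFAULT_AUTHENTICATION_CLASSES):

```python
# In settings.py: register the custom class as a default authentication class
REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": (
        "apps.newspaper.authentication.AccessKeyAuthentication",
    ),
}
```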


To use it, we import it and apply it as a decorator like below:

from rest_framework.decorators import api_view, authentication_classes

from apps.newspaper.authentication import AccessKeyAuthentication


@api_view(["GET"])
@authentication_classes((AccessKeyAuthentication, ))
def list_news(request):
    # your code goes here

Then call the endpoint like /news?access_key=ACCESS_KEY, and it will return our REST output.

In this tutorial the Subscriber model has a field called “access_key”; you can use any other model/field for the authentication check.

This is how I usually apply authentication in DRF-based REST APIs at first; as the API grows, I add more sophisticated authentication. DRF also ships with token-based authentication, which is described briefly in the docs.

Further reading:
DRF Authentication Documentation


Enable CORS in Bottle (Python)

To access a REST API's data from another domain, the API must have CORS enabled for that website. Like most frameworks, Bottle does not set the CORS header by default. To enable it, the following decorator can be used:

from bottle import Bottle, response

app = Bottle()


def allow_cors(func):
    """Decorator which enables CORS for the decorated endpoint."""
    def wrapper(*args, **kwargs):
        # use '*' in case you want the API to be accessed from any website
        response.headers['Access-Control-Allow-Origin'] = 'example.com'
        return func(*args, **kwargs)
    return wrapper


# example usage in an API endpoint
@app.route('/cakes/<cake_id>')
@allow_cors
def get_cakes_by_id(cake_id):
    cakes = []  # load cakes by ID here
    return {"cakes": cakes}


The header “Access-Control-Allow-Origin” will be added to the API response; per our example, it will be Access-Control-Allow-Origin: example.com. To enable it for any website you can set it to “*”. There is an interesting discussion on whether to set it to * or not.
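If you want the allowed origin to be configurable per endpoint, a parameterized variant of the decorator can be sketched like this; it is framework-free here, with a plain dict standing in for bottle's response.headers:

```python
# Stand-in for bottle's response.headers, so the sketch runs without Bottle
headers = {}


def allow_cors_from(origin):
    """Decorator factory: pass the allowed origin, or '*' for any website."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            headers['Access-Control-Allow-Origin'] = origin
            return func(*args, **kwargs)
        return wrapper
    return decorator


@allow_cors_from('*')
def list_cakes():
    return {"cakes": []}


list_cakes()  # sets Access-Control-Allow-Origin: * in headers
```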

Google places PHP API client.

Google has an API that provides data about places, called “Google Places”. You can query the API using a location name, optionally combined with a service (e.g. “Burger in New York”), and it returns places matching your keyword and location.

Last year I wrote a PHP wrapper on top of the API so it can be accessed from PHP applications. I used Composer for the first time, as the dependency manager for the wrapper.


To use it, set the configuration (update the API key of your Google Places account) and you are ready to go:

require_once 'googleplaces.class.php';

$googlePlaces = new GooglePlaces(array(
    'apiKey' => "YOUR_API_KEY"
));
$query['query'] = "Burger in Berlin";
$places = $googlePlaces->getLocationsByTextSearch($query);
$decoded_data = json_decode($places, true);

Feel free to fork it and contribute 🙂

Node.js script to make a zip archive

While working on a Node.js project, I had a use case where the user queries by a date range and the results are PDFs filtered by that range. I had to create a directory of PDF files and return it to the user as a zip file.

There are many zip-related modules; I tried a couple of the active ones, but none met my requirements, and some had open bugs at the time. After some trial and error going through most of the zip modules, I settled on archiver. Below is a sample of how it worked.



npm install archiver


var archiver = require('archiver'),
    archive = archiver('zip'),
    fs = require('fs');

var output = fs.createWriteStream(__dirname + '/mocks.zip');
archive.pipe(output);

var getStream = function(fileName){
    return fs.readFileSync(fileName);
};

// these are the files we want to put into the zip archive
var fileNames = ['mock1.data', 'mock2.data', 'mock3.data'];
for(var i = 0; i < fileNames.length; i++){
    var path = __dirname + '/' + fileNames[i];
    archive.append(getStream(path), { name: fileNames[i] });
}

archive.finalize(function(err, bytes) {
    if (err) {
        throw err;
    }
    console.log(bytes + ' total bytes');
});



After cloning the gist, make sure you have the files to zip, and run the script with node.


Happy coding 🙂

Python script to download Google spreadsheet.

I like to automate tasks; I think every software engineer likes that, right? After all, that's our job. I wrote the following script to download a Google spreadsheet as CSV. I just came across it while going through my old code base; hopefully it will help someone else too.

To run the script you have to install the gdata Python module.

import os
import sys
from getpass import getpass

import gdata.docs.service
import gdata.spreadsheet.service


def get_gdoc_information():
    """Get user information from the command line argument and
    pass it to the download method."""
    email = raw_input('Email address:')
    password = getpass('Password:')
    gdoc_id = raw_input('Google Doc Id:')
    try:
        download(gdoc_id, email, password)
    except Exception, e:
        raise e


# python gdoc.py 1m5F5TXAQ1ayVbDmUCyzXbpMQSYrP429K1FZigfD3bvk#gid=0
def download(gdoc_id, email, password, download_path=None):
    print "Downloading the CSV file with id %s" % gdoc_id
    gd_client = gdata.docs.service.DocsService()
    # auth using ClientLogin
    gs_client = gdata.spreadsheet.service.SpreadsheetsService()
    gs_client.ClientLogin(email, password)
    # getting the key (resource id) and tab id from the ID
    resource = gdoc_id.split('#')[0]
    tab = gdoc_id.split('#')[1].split('=')[1]
    resource_id = 'spreadsheet:' + resource
    if download_path is None:
        download_path = os.path.abspath(os.path.dirname(__file__))
    file_name = os.path.join(download_path, '%s.csv' % gdoc_id)
    print 'Downloading spreadsheet to %s…' % file_name
    # the Export call needs the spreadsheets service token, so swap it in
    docs_token = gd_client.GetClientLoginToken()
    gd_client.SetClientLoginToken(gs_client.GetClientLoginToken())
    gd_client.Export(resource_id, file_name, gid=tab)
    gd_client.SetClientLoginToken(docs_token)
    print "Download Completed!"
    return file_name


if __name__ == '__main__':
    get_gdoc_information()



You have to run the script like below:

python gdooc.py spread_sheet_id#gid=tab_id
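The script splits that argument into the spreadsheet key and the tab id like this:

```python
# Same split logic as in the download() function above
gdoc_id = "spread_sheet_id#gid=tab_id"
resource = gdoc_id.split('#')[0]              # the spreadsheet key
tab = gdoc_id.split('#')[1].split('=')[1]     # the tab (gid) value
```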

For example check the following screenshot:



After downloading, you will have the CSV file in the same directory; currently the document id is used as the name of the CSV file, but you can change that as you want.

Happy Coding 🙂

Attaching new EBS volume in AWS EC2 instance.


In AWS, EC2 provides 8 GB of storage by default. In a past project I had to extend the storage of one of my development instances as the data was growing fast. From the AWS console, add a new EBS volume and attach it to your instance; then log into your EC2 instance via ssh.

Run the following command:

sudo fdisk -l

which will show the list of volumes, with the newly added volume unpartitioned, something like below:


The next step is to build the file system on the new EBS volume using the Unix command mkfs, like below:

sudo mkfs -t ext4 /dev/xvdf


Next, mount it at your desired path, e.g. /mnt/ebs1. Run the following command:

sudo mount /dev/xvdf /mnt/ebs1

Then add an entry into /etc/fstab. it would be something like this:

/dev/xvdf  /mnt/ebs1 ext4 defaults 1 1

One caveat: if you add the EBS volume to /etc/fstab and there is any issue with the volume (file system corruption, unavailability of the zone, etc.) while booting the instance, the instance will not boot, because during boot the system looks for the entry and, when the volume is not available, the whole instance stays down. Check the AWS forum post for details.

Also check this SO discussion for alternative ways to resolve the issue (using a script, for example).

Check the following docs if you want to learn more about the Unix commands used in this post:

fdisk, mount and umount, and mkfs.

Serving static files using Restify

I was working with Node.js to build a REST API, using restify for the REST module. restify is a simple yet powerful Node module. One use case of the API was serving static files for a specific route. I went through the docs and tried different things but couldn't figure it out at first. After hustling for hours, Christian and I dug into it and figured it out!

So in my case the configuration was something like this:

server.get(/.*/, restify.serveStatic({
    'directory': 'static_content',
    'default': 'index.html'
}));
When restify internally resolves the path, it looks for ‘static_content/index.html’.

I have coded a very basic application to show how it works; the sample application looks like below.


The index.html is a one-line page containing just the text “Serving html from restify!”. The package.json:

{
  "name": "7271849",
  "version": "0.0.1",
  "description": "A sample application that shows how to serve static file using restify.",
  "main": "api.js",
  "dependencies": {
    "restify": "~2.6.0"
  },
  "devDependencies": {},
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node server.js"
  },
  "repository": {
    "type": "git",
    "url": "https://gist.github.com/7271849.git"
  },
  "keywords": [],
  "author": "Mushfiq-E Mahabub",
  "license": "wtfpl",
  "gitHead": "2c8e160f865cc11b2b09a5681b220c442a47ab17",
  "bugs": {
    "url": "https://gist.github.com/7271849"
  }
}
view raw
hosted with ❤ by GitHub

var restify = require('restify');

var server = restify.createServer();
server.get(/.*/, restify.serveStatic({
    'directory': '.',
    'default': 'index.html'
}));

server.listen(8080, function() {
    console.log('%s listening at %s', server.name, server.url);
});


In this sample application the static content (index.html) is in the same root as server.js, which is why ‘directory’ is ‘.’.

Use package.json to install the necessary modules and start playing 🙂