tulo.js

Making service workers easy to use so that your app can be fast and reliable, even offline.

Welcome to tulo.js, a service worker library that allows you to implement caching strategies via the powerful Service Worker browser API to make your website more robust.

The current version of tulo.js supports the following functionality:

  • Configure caching strategies for different files (markup, stylesheets, images, fonts, etc.) based on your business needs
  • Sign in to tulojs.com to monitor caching activity from your deployed website for each resource/file including average load times, resource size, and user connection types (e.g. 4G, 2G, Offline)

Thanks for checking out our library! Please let us know of any feature requests or bugs by raising a GitHub issue.

Getting Started

Installation

  1. Run npm i tulo-js in your project's root directory to install the tulo-js npm package.

Add a service worker

  2. Run touch service-worker.js in your project's public/ directory (or wherever you store static assets) to create the service worker file. You can name this file sw.js (or anything else) if you prefer a shorter name.

  3. If you are using Express.js to serve your front end, create an endpoint that responds to GET requests to /tulo by sending node_modules/tulo-js/tulo.js as the response (a sketch of such an endpoint follows the import snippet below). Otherwise, adjust the import statement in the next step to use node_modules/tulo-js/tulo.js instead of /tulo.

  4. At the top of service-worker.js, import the tulo library:

  // Use the below import statement if you set up an Express endpoint
  import { cacheGenerator } from '/tulo';
  // Otherwise, import the library from node_modules
  import { cacheGenerator } from 'node_modules/tulo-js/tulo.js';
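
If you chose the Express route in step 3, a minimal sketch of that endpoint is shown below. The server file name, port, and public/ static directory are assumptions about your setup, not part of tulo-js:

  // server.js (hypothetical) - adjust paths and port to match your project
  const express = require('express');
  const path = require('path');

  const app = express();

  // Serve static assets, including service-worker.js, from public/
  app.use(express.static(path.join(__dirname, 'public')));

  // Send the tulo-js library in response to GET /tulo so the service worker can import it
  app.get('/tulo', (req, res) => {
    res.sendFile(path.join(__dirname, 'node_modules', 'tulo-js', 'tulo.js'));
  });

  app.listen(3000);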

If you are having trouble importing tulo-js from node_modules, run the command below in your terminal from the root directory to copy the library file into your public/ directory alongside your client-side code. To learn more about service worker imports, check out Jeff Posnick's article on the limitations of ES Module imports in service workers.

cp ./node_modules/tulo-js/tulo.js ./public
  5. Add a version number to service-worker.js. Remember to update this version number whenever you make updates to this file. This will ensure that a new service worker is installed then activated and your caches are automatically refreshed when you update your caching strategy.
  const version = 1.0; // update version number when you change this file to register changes
  6. Develop a caching strategy for each of your website's resources (i.e. pages, stylesheets, images, logos, fonts, icons, audio/video, etc.). For example, you might want your pages to be requested fresh from the network whenever possible, so your caching strategy would be NetworkFirst. A NetworkFirst strategy will retrieve the resource from the network and add it to the cache. If the network fails or the server is down on a subsequent request, the resource will be served from the cache as a fallback. That way, if your users go offline, they can still access your pages from the cache if it has been populated on previous requests. That is the magic of service workers! Here are the caching strategies currently supported by tulo.js (a simplified sketch of the NetworkFirst flow follows this list):
  • NetworkFirst: Requests resource from the network, serves response to user, and adds resource to the specified cache. If the network request fails – either due to a faulty/offline connection or a server error – the service worker will check the cache for that resource and serve it to the client if found
  • CacheFirst: Checks caches to see if the requested resource has already been cached, and serves it to the client if so. Otherwise, requests resource from the network and stores it in the specified cache
  • NetworkOnly: Requests resource from the network and serves response to user. If the network request fails, a message is sent in response that the resource could not be found
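
To make the strategies concrete, here is a simplified sketch of what a NetworkFirst flow looks like inside a service worker fetch handler. This is for illustration only and is not tulo.js's internal implementation; with tulo.js you only declare the strategy name in a cache spec.

  // Illustrative NetworkFirst sketch - tulo.js handles this for you
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      fetch(event.request)
        .then((response) => {
          // Network succeeded: store a copy in the cache, then serve the fresh response
          const copy = response.clone();
          caches.open('pagesCache').then((cache) => cache.put(event.request, copy));
          return response;
        })
        // Network failed: fall back to a previously cached response, if any
        .catch(() => caches.match(event.request))
    );
  });
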
  7. For each unique caching strategy (e.g. a caching strategy for images), write a cache specification in service-worker.js. Sample code for caching your images is provided below. See step 8 for a boilerplate cache spec you can copy and paste into your service-worker.js file.
  const imageCacheSpec = {
    name: 'imageCache' + version,
    types: ['image'],
    urls: ['/logo.png', '/icon.png', '/banner.png'],
    strategy: 'CacheFirst',
    expiration: 60*60*1000
  };
  8. Here is a boilerplate cache spec you can copy and paste in your file:
  const sampleCacheSpec = {
    name: 'sampleCache' + version, // give your cache a name, concatenated to the version so you can verify your cache is up-to-date in the browser
    types: [], // input MIME types e.g. text/html, text/css, image/gif, etc.
    urls: [], // input any file paths to be cached using this cacheSpec
    strategy: '', // currently supported strategies are: CacheFirst, NetworkFirst, NetworkOnly
    expiration: 60*60*1000 // in milliseconds - this field is OPTIONAL - if omitted, these urls will be refreshed when the service worker restarts
  }
  9. At the bottom of service-worker.js, add all your cache specifications into an array, and pass it as an argument to the cacheGenerator function.
  // If you have multiple cacheSpecs for different file types, include your page/markup caches first followed by images, stylesheets, fonts, etc.
  cacheGenerator([pagesCacheSpec, imageCacheSpec, stylesCacheSpec, fontCacheSpec]);
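
The pagesCacheSpec, stylesCacheSpec, and fontCacheSpec referenced above are placeholders for specs you define yourself. For example, a NetworkFirst spec for your pages might look like the following sketch (the urls listed are assumptions; use your own routes):

  const pagesCacheSpec = {
    name: 'pagesCache' + version,
    types: ['text/html'],
    urls: ['/', '/about'], // replace with the routes you want cached
    strategy: 'NetworkFirst'
    // no expiration: these urls are refreshed when the service worker restarts
  };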

Register Service Worker

  10. In your project's root file, add the below code snippet to register your service worker. If you are running a React app, this would be in your top-level component (e.g. App.jsx or index.jsx). If you are creating a project with static HTML pages, add this snippet in your root HTML file (e.g. index.html) at the bottom of your body tag, inside an opening <script type="module"> tag and a closing </script> tag (a sketch of this variant follows the snippet below).
  if (navigator.serviceWorker) {
    navigator.serviceWorker.register('/service-worker.js', {
        type: 'module',
        scope: '/'
      })
      // To ensure your service worker registers properly, chain then/catch below - feel free to remove once it is successfully registering
      .then((registration) => console.log(`Service worker registered in scope: ${registration.scope}`))
      .catch((e) => console.log(`Service worker registration failed: ${e}`));
  }
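
For a static HTML project, the equivalent registration inside index.html might look like the sketch below (the file layout here is an assumption; adjust the service worker path if you serve it from somewhere other than the site root):

  <!-- At the bottom of <body> in index.html -->
  <script type="module">
    if (navigator.serviceWorker) {
      navigator.serviceWorker
        .register('/service-worker.js', { type: 'module', scope: '/' })
        .then((registration) => console.log(`Service worker registered in scope: ${registration.scope}`))
        .catch((e) => console.log(`Service worker registration failed: ${e}`));
    }
  </script>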

Check your service worker and caches in DevTools

  11. Serve your application. Open up Google Chrome and navigate to your website. Open your Chrome DevTools by right-clicking on the page and selecting Inspect (or pressing cmd+option+I on Mac, ctrl+shift+I on Windows). Navigate to the Application panel and click Service Workers in the sidebar. You should see a new service worker installed and activated.

  12. Click on Cache Storage in the Application panel sidebar under Cache. Here you should be able to see each cache you created in service-worker.js and the files stored in them.
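
As a quick sanity check, you can also list your caches from the DevTools Console using the standard CacheStorage API:

  // Run in the DevTools Console - logs your cache names, e.g. ['imageCache1', 'sampleCache1']
  caches.keys().then((names) => console.log(names));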

Sign in on tulojs.com for monitoring and insights

  13. Visit tulojs.com/dashboard to monitor your caching strategies in production. You'll be able to view the caching strategies you implemented on a per resource basis, including statistics on cache events and your users. For example, what percentage of the time is your site's logo image being fetched from the cache versus the network? What is the difference in average load time when it is fetched from the cache versus the network? What percentage of your users are accessing your about page when their connection is offline?

Notes & Resources

  • Service Workers only work with HTTPS (localhost is an exception)
  • web.dev has many fantastic articles on service workers, caching, and more – check out the overview on workers to get started
  • Workbox is a robust library for service worker implementation if you are interested in diving deeper into caching possibilities (it served as an inspiration for tulo.js, a lightweight library with monitoring insights)
  • serviceworke.rs is a great website with a cookbook for service workers if you want to get your hands dirty building from scratch