
Overview

Turborepo Remote Cache


This project is an open-source implementation of the Turborepo custom remote cache server. If Vercel's official cache server isn't a viable option, this server is an alternative for self-hosted deployments. It supports several storage providers and deployment environments, and it provides "Deploy to" buttons for one-click deployments wherever possible.

Supported Storage Providers

  • Local filesystem
  • AWS S3
  • Azure Blob Storage (WIP)
  • Google Cloud Storage (WIP)
  • Google Drive Blobs (WIP)

ENV VARS

  • NODE_ENV: String. Optional. Possible values: development | production. Default value: production.
  • PORT: Number. Optional. Default value: 3000.
  • TURBO_TOKEN: String. Secret token used for authentication. The value must match the one provided to the --token parameter of the build script. See "Enable custom remote caching in your Turborepo monorepo" for more info. Keep this value private.
  • LOG_LEVEL: String. Optional. Default value: info.
  • STORAGE_PROVIDER: String. Optional. Possible values: local | s3. Default value: local. Use this var to choose the storage provider.
  • STORAGE_PATH: String. Caching folder. If STORAGE_PROVIDER is set to s3, this will be the name of the bucket.
  • S3_ACCESS_KEY: String. Used only if STORAGE_PROVIDER=s3
  • S3_SECRET_KEY: String. Used only if STORAGE_PROVIDER=s3
  • S3_REGION: String. Used only if STORAGE_PROVIDER=s3
  • S3_ENDPOINT: String. Optional. Used only if STORAGE_PROVIDER=s3. NOTE: This var can be omitted if the other s3 vars are provided.
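For illustration, a minimal .env sketch for each provider might look like the following; every value below (token, bucket name, keys, region) is a placeholder, not a real credential:

```shell
# Local filesystem provider (the default)
NODE_ENV=production
PORT=3000
TURBO_TOKEN=replace-with-a-private-token
STORAGE_PROVIDER=local
STORAGE_PATH=/tmp/turborepo-cache

# S3 provider: STORAGE_PATH becomes the bucket name
# STORAGE_PROVIDER=s3
# STORAGE_PATH=my-cache-bucket
# S3_ACCESS_KEY=placeholder-access-key
# S3_SECRET_KEY=placeholder-secret-key
# S3_REGION=us-east-1
```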

Deployment Environments

Enable custom remote caching in your Turborepo monorepo

To enable a custom remote caching server in your Turborepo monorepo, you must add a config file by hand. The turbo login command works only with the official Vercel server.

  1. Create a .turbo folder at the root of your monorepo.
  2. Create a config.json file inside it, and add these properties:
    • teamId: any string that starts with "team_". This value is used as the cache storage folder for the current repository. Ex. team_myteam
    • apiUrl: the address of a running turborepo-remote-cache server.

For example:

.turbo/config.json

{
  "teamId": "team_FcALQN9XEVbeJ1NjTQoS9Wup",
  "apiUrl": "http://localhost:3000"
}
  3. Modify your Turborepo top-level build script, adding the --token= parameter. Note: the token value must be the same as the one used for your TURBO_TOKEN env var. See the ENV VARS section for more info.

For example:

package.json

//...
  "build": "turbo run build --token=\"yourToken\"",
  "dev": "turbo run dev --parallel",
  "lint": "turbo run lint",
  "format": "prettier --write \"**/*.{ts,tsx,md}\""
//...
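To sanity-check the server and token wiring, you can query the artifacts endpoint directly. This is only a sketch: it assumes a server running on localhost:3000, a token of yourToken, and the /v8/artifacts/:hash route that Turborepo itself calls; the hash and teamId values are placeholders. (No test is included since it requires a live server.)

```shell
# With a valid token, a 404 just means the artifact isn't cached yet;
# a 401/403 instead points at a token mismatch.
curl -i \
  -H "Authorization: Bearer yourToken" \
  "http://localhost:3000/v8/artifacts/placeholder-hash?teamId=team_myteam"
```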

Deploy on Vercel

The server can be easily deployed as a Vercel Function using the deploy button.

Note: Local storage isn't supported for this deployment method.

Deploy with Vercel

Deploy on Docker

You can find the image on Docker Hub.

  1. Create an .env file containing all of the env vars you need. Check the ENV VARS section for more info.
NODE_ENV=
PORT=
TURBO_TOKEN=
LOG_LEVEL=
STORAGE_PROVIDER=
STORAGE_PATH=
S3_ACCESS_KEY=
S3_SECRET_KEY=
S3_REGION=
S3_ENDPOINT=
  2. Run the image using the .env file created in step one.
docker run --env-file=.env -p 3000:3000 fox1t/turborepo-remote-cache

Deploy on DigitalOcean

The server can be easily deployed on the DigitalOcean App Platform.

Note: Local storage isn't supported for this deployment method.

Deploy to DO

Contribute to this project

  1. Clone this repository:

    git clone git@github.com:fox1t/turborepo-remote-cache.git

  2. cd turborepo-remote-cache

  3. npm i

  4. cp .env.example .env

  5. Put your env vars in the .env file. See the ENV VARS section for more details.

  6. npm run dev

Comments
  • Cache not being utilized when called through Docker build

    I've successfully cloned the turborepo-remote-cache and I've hooked it up in AWS. Everything is working fine, up until I try to use the cache from docker. Here's my dockerfile:

    FROM docker.artifactory.moveaws.com/node:16 as build
    COPY . /app
    WORKDIR /app
    RUN yarn install
    RUN yarn build
    RUN npx turbo run build
    

    Both RUN yarn build and RUN npx turbo run build are not updating the cache. Originally I thought that my docker instance couldn't communicate with the deployed service, but when I added a CURL inside the container, I could verify that outbound calls are properly triggered.

    My question is, has anyone run into this? Maybe someone has an example of a dockerfile already out there? Or what could possibly be the issue that the commands won't trigger the cache during the build phase?

    question 
    opened by mkotsollaris 18
  • Guidance for running as AWS Lambda

    Looking at the implementation, it does feel like it can be deployed anywhere, however, the S3 read/writes might be tricky – especially for large files, as Lambdas are essentially short-lived.

    Do you folks have any guidance/tips on how to effectively run this on AWS Lambda?

    opened by paambaati 10
  • publish on npm?

    for folks wanting to run this locally between worktrees, not needing docker or needing to clone + compile would be a big help.

    just a lil npx turborepo-remote-cache or something would be great :D <3

    opened by NullVoxPopuli 9
  • Feat/Application Default Credentials GCP auth

    From docs:

    ADC is a strategy used by Cloud Client Libraries and Google API Client Libraries to automatically find credentials based on the application environment, and use those credentials to authenticate to Google Cloud APIs. When you set up ADC and use a client library, your code can run in either a development or production environment without changing how your application authenticates to Google Cloud services and APIs.

    ADC searches for credentials in the following locations: GOOGLE_APPLICATION_CREDENTIALS environment variable User credentials set up with the Google Cloud CLI The attached service account, as provided by the metadata server

    By simply not providing credentials to Storage() constructor we instruct it to use one of the above three ADC methods. The 2nd and 3rd scenarios are most common in my experience - SDK knows where to find local credentials, and it will also check the GCP metadata server for the Service Account information (this is used for Workload Identity auth in GKE - the recommended way to authenticate with GCP. Very similar to AWS's IAM roles for service accounts/IRSA)

    A slightly different approach to an existing PR https://github.com/ducktors/turborepo-remote-cache/pull/66: *Use ADC if no explicit credentials were provided. Many tools that use GCP SDK follow this logic.

    bug 
    opened by emalihin 7
  • feat: head support

    Turborepo introduced support for dry runs to inspect caches and report whether a cache entry exists in the local and/or remote cache for a given hash. In order to leverage this, remote cache implementors must implement a HEAD route to check for the resource. This PR enables that.

    See the following for references:

    https://github.com/vercel/turborepo/issues/1437 https://github.com/vercel/turborepo/pull/1988

    and this article for use cases on leveraging this new info reported by dry-run.

    https://medium.com/@sppatel/maximizing-job-parallelization-in-ci-workflows-with-jest-and-turborepo-da86b9be0ee6

    enhancement 
    opened by sppatel 7
  • feat | added support for AWS_SESSION_TOKEN

    Description

    This PR adds support for the AWS_SESSION_TOKEN variable, which is used with short-lived AWS IAM credentials. The logic added will only include this variable if the value is defined; otherwise it will be skipped. This should hopefully not cause any issues for installations not using the sessionToken.

    I did not find any tests related to connecting to AWS S3 so I wasn't able to improve test coverage; should I try and investigate how to test this particular feature or can I just leave the PR as is?

    Docs

    Implemented based on these SDK docs.

    enhancement 
    opened by wdalmijn 7
  • Large build output cannot be uploaded

    I found that artifacts from the end of my build pipeline (just before deploy) are not cached.
    Bit of a show-stopper as it causes a full redeploy!

    It looks like Fastify has a default body size limit of 1 MiB, and so I'm speculating this is the reason.

    Could we have a configuration option to bump this considerably higher?

    A backport to the last stable version (so 1.1.2 I guess) would be really helpful, BTW.

    enhancement help wanted 
    opened by asquithea 7
  • feat: add support Application Default Credentials for google-cloud-storage

    This PR adds a setting to use Application Default Credentials for authentication with google-cloud-storage.

    reference:

    • https://cloud.google.com/docs/authentication/application-default-credentials
    • https://googleapis.dev/nodejs/storage/latest/Storage.html#Storage-examples
    enhancement 
    opened by nakatanakatana 6
  • feat: how to run in aws lambda

    In this PR:

    A guide on how to run the turborepo-remote-cache server in an AWS Lambda Function.

    Credit to @cpitt for a starting point on the handler code in #28.

    This is dependent on #86, as that change is required to allow the Lambda's temporary credentials to access the S3 bucket.

    Issues reference:

    • #28
    documentation enhancement 
    opened by tom-fletcher 5
  • Error (412 Precondition Failed) when creating artifact

    On deploying to Vercel and using S3 storage, I get this error when trying to run turbo commands –

    {
    	"severity": "WARNING",
    	"level": 40,
    	"time": 1655122987166,
    	"pid": 9,
    	"hostname": "169.254.33.177",
    	"reqId": "q-nGx8L1TCaa2laKhCeJRA-25",
    	"data": {
    		"message": "Access Denied",
    		"code": "AccessDenied",
    		"region": null,
    		"time": "2022-06-13T12:23:07.166Z",
    		"requestId": "KRFCEX6FM17HZD2Z",
    		"extendedRequestId": "xYHj4oepC5Eu4oD0QcrQcTGCbOyITRXV7rp2JSXa1RwAVSOKsiI0b2gkABewiDxQXa7GN+8RUGM=",
    		"statusCode": 403,
    		"retryable": false,
    		"retryDelay": 26.181109834484474
    	},
    	"isBoom": true,
    	"isServer": false,
    	"output": {
    		"statusCode": 412,
    		"payload": {
    			"statusCode": 412,
    			"error": "Precondition Failed",
    			"message": "Error during the artifact creation"
    		},
    		"headers": {}
    	},
    	"stack": "Error: Error during the artifact creation\n    at Object.handler (/vercel/path0/src/plugins/remote-cache/routes/put-artifact.ts:29:31)\n    at processTicksAndRejections (node:internal/process/task_queues:96:5)",
    	"type": "Error",
    	"message": "Error during the artifact creation"
    }
    

    I've made sure the environment variables are correctly set up, the IAM credentials are right, and that the user has full administrator access on AWS.

    opened by paambaati 5
  • `/v8/artifacts/events` returns 404 and doesn't store data locally

    When running locally I can't get my repo to actually cache the results.

    Setup in turborepo-remote-cache:

    yarn build
    NODE_ENV=development TURBO_TOKEN=\"123\" yarn start
    

    setup in test-turbo-reop

    .turbo/config.json

    {
        "teamId": "team_FcALQN9XEVbeJ1NjTQoS9Wup",
        "apiUrl": "http://localhost:3000"
    }
    

    package.json

    {
        ...
        "scripts":{
                "build:turbo": "turbo run build --token=\"123\"",
                ...
        }
    }
    

    Log output from turborepo-remote-cache

    {"severity":"INFO","level":30,"time":1654715502319,"pid":83236,"hostname":"MacBook-Pro-2.local","message":"Server listening at http://0.0.0.0:3000"}
    ^[[A{"severity":"INFO","level":30,"time":1654715546148,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-0","req":{"method":"POST","url":"/v8/artifacts/events?teamId=team_FcALQN9XEVbeJ1NjTQoS9Wup","hostname":"localhost:3000","remoteAddress":"127.0.0.1","remotePort":59506},"message":"incoming request"}
    {"severity":"INFO","level":30,"time":1654715546151,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-0","message":"Route POST:/v8/artifacts/events?teamId=team_FcALQN9XEVbeJ1NjTQoS9Wup not found"}
    {"severity":"INFO","level":30,"time":1654715546155,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-0","res":{"statusCode":404},"responseTime":5.764100074768066,"message":"request completed"}
    {"severity":"INFO","level":30,"time":1654715546156,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-1","req":{"method":"POST","url":"/v8/artifacts/events?teamId=team_FcALQN9XEVbeJ1NjTQoS9Wup","hostname":"localhost:3000","remoteAddress":"127.0.0.1","remotePort":59508},"message":"incoming request"}
    {"severity":"INFO","level":30,"time":1654715546156,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-1","message":"Route POST:/v8/artifacts/events?teamId=team_FcALQN9XEVbeJ1NjTQoS9Wup not found"}
    {"severity":"INFO","level":30,"time":1654715546156,"pid":83236,"hostname":"MacBook-Pro-2.local","reqId":"vmy6wThJSLCcGWwTxwgjpg-1","res":{"statusCode":404},"responseTime":0.36331605911254883,"message":"request completed"}
    

    Is there something that I am missing with having to register a team so that it doesn't 404 with the server or a configuration step that I missed in the docs?

    enhancement good first issue 
    opened by StevenMatchett 5
  • Cache server not working

    Hope anyone that reads this is having a great day!

    I only have the following line in the Docker Container logs, and no other log appears when I use the remote cache.

    54896/11/01 01:17PM 30 Server listening at http://0.0.0.0:3000 | severity=INFO pid=6 hostname=turborepo-remote-cache
    

    How do I know when I run the npm script build command, the action is importing the cache, and/or exporting it?

    Is the STORAGE_PATH environmental variable correctly used along with volumes?

    The following contains the contents of the TurboRepo Remote Cache docker-compose.yml file.

    version: '3.9'
    services:
    
      turborepo-remote-cache:
        image: fox1t/turborepo-remote-cache:1.8.0
        container_name: turborepo-remote-cache
        hostname: turborepo-remote-cache
        environment:
          - NODE_ENV=production
          - TURBO_TOKEN='xxx'
          - LOG_LEVEL=debug
          - STORAGE_PROVIDER=local
          - STORAGE_PATH=/tmp
        volumes:
          - /mnt/appdata/repo-team/turborepo-remote-cache/tmp:/tmp
        ports:
          - 3535:3000
        networks:
          - proxy
    
    networks:
      proxy:
        driver: bridge
        external: true
    

    Originally posted by @NorkzYT in https://github.com/ducktors/turborepo-remote-cache/discussions/84

    opened by NorkzYT 13
  • Health check without auth header

    πŸš€ Feature Proposal

    Create a (or modify the existing) health check endpoint that is not protected with the authorization header.

    Motivation

    Without this, it's not possible to scale the service behind an AWS ALB with a health check as ALB doesn't support passing headers.

    Example

    AWS ALB health checks

    opened by jazmon 2
Releases (latest: v1.10.1)
  • v1.10.1(Dec 20, 2022)

  • v1.10.0(Dec 14, 2022)

  • v1.9.0(Dec 13, 2022)

  • v1.8.0(Nov 12, 2022)

  • v1.7.4(Nov 5, 2022)

  • v1.7.3(Nov 5, 2022)

  • v1.7.2(Nov 4, 2022)

  • v1.7.1(Nov 3, 2022)

  • v1.7.0(Oct 14, 2022)

  • v1.6.7(Oct 11, 2022)

  • v1.6.6(Sep 27, 2022)

  • v1.6.5(Sep 15, 2022)

  • v1.6.4(Sep 15, 2022)

  • v1.6.3(Sep 15, 2022)

  • v1.6.2(Sep 15, 2022)

  • v1.6.1(Sep 15, 2022)

    What's Changed

    • Fix docker build issue by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/42

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.6.0...v1.6.1

  • v1.6.0(Sep 14, 2022)

    What's Changed

    • feat: add google-cloud-storage provider by @dlehmhus in https://github.com/fox1t/turborepo-remote-cache/pull/38

    New Contributors

    • @dlehmhus made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/38

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.5.1...v1.6.0

  • v1.5.1(Jul 15, 2022)

  • v1.5.0(Jul 15, 2022)

    What's Changed

    • feat: adds suport for POST artifacts/events call by @fox1t in https://github.com/fox1t/turborepo-remote-cache/pull/35

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.4.0...v1.5.0

  • v1.4.0(Jul 15, 2022)

  • v1.3.0(Jul 13, 2022)

    What's Changed

    • fix: fixes vercel deploy and adds better handling for s3 defaults en… by @fox1t in https://github.com/fox1t/turborepo-remote-cache/pull/33

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.2.0...v1.3.0

  • v1.2.0(Jul 13, 2022)

    What's Changed

    • Adds support for cli by @fox1t in https://github.com/fox1t/turborepo-remote-cache/pull/24
    • build(node): upgrades node from v14 to v16 by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/25
    • Use built-in AWS credentials loading system by @dobesv in https://github.com/fox1t/turborepo-remote-cache/pull/22
    • feat: multiple tokens from env by @lodmfjord in https://github.com/fox1t/turborepo-remote-cache/pull/27

    New Contributors

    • @dobesv made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/22
    • @lodmfjord made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/27

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.1.1...v1.2.0

  • v1.1.2-alpha.0(May 23, 2022)

    What's Changed

    • Adds support for cli by @fox1t in https://github.com/fox1t/turborepo-remote-cache/pull/24
    • build(node): upgrades node from v14 to v16 by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/25
    • Use built-in AWS credentials loading system by @dobesv in https://github.com/fox1t/turborepo-remote-cache/pull/22

    New Contributors

    • @fox1t made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/24
    • @dobesv made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/22

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.1.1...v1.1.2-alpha.0

  • v1.1.1(May 19, 2022)

    What's Changed

    • chore(deps): bump github/codeql-action from 1 to 2 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/16
    • chore(deps): bump actions/checkout from 2 to 3.0.2 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/15
    • chore(deps): bump actions/setup-node from 2.5.1 to 3.2.0 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/21
    • chore(deps): bump docker/metadata-action from 3 to 4 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/19
    • chore(deps): bump docker/login-action from 1 to 2 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/23
    • chore(deps): bump docker/build-push-action from 2 to 3 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/18

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/compare/v1.1.0...v1.1.1

  • v1.0.0(Jan 16, 2022)

    What's Changed

    • Add Vercel support by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/3
    • chore(deps): bump actions/setup-node from 2.4.1 to 2.5.1 by @dependabot in https://github.com/fox1t/turborepo-remote-cache/pull/2
    • build: updates dockerfile by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/4
    • docs: adds logo on readme by @tehKapa in https://github.com/fox1t/turborepo-remote-cache/pull/5
    • docs(README): fixes .env copy command by @bmuenzenmeyer in https://github.com/fox1t/turborepo-remote-cache/pull/7

    New Contributors

    • @tehKapa made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/3
    • @dependabot made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/2
    • @bmuenzenmeyer made their first contribution in https://github.com/fox1t/turborepo-remote-cache/pull/7

    Full Changelog: https://github.com/fox1t/turborepo-remote-cache/commits/v1.0.0

Owner
Maksim Sinik
Cloud Software Architect. DevOps lover. Container enthusiast. Conference Speaker. @HospitalRun lead maintainer; @fastify core team