This is the TypeScript track, one of the many tracks on Exercism. It holds all the exercises that are currently implemented and available for students to complete.
The track consists of various core exercises (the ones a student must complete), and each core exercise may unlock various side exercises. You can find this structure in the `config.json`.
It's not uncommon for people to discover incorrect implementations of certain tests, have a suggestion for a track-specific hint to aid the student with the TypeScript specifics, see optimisations in the configuration of `jest`, `eslint` or other dependencies, report missing edge cases, factual errors or logical errors, or want to implement existing exercises or develop new ones.
We welcome contributions of all sorts and sizes, from reporting issues to submitting patches, as well as joining the current discussions 💬.
This guide covers several common scenarios pertaining to improving the TypeScript track. There are several other guides about contributing to other parts of the Exercism ecosystem that are similar to this repository:
- Generic information about track-contributing
- The JavaScript track
- The JavaScript Analyzer
- The JavaScript Test Runner
- The TypeScript Analyzer
- The TypeScript Test Runner
Help us keep Exercism welcoming. Please read and abide by the Code of Conduct.
Before contributing code to any existing exercise or any new exercise, please have a thorough look at the current exercises and dive into open issues.
There are two ways to implement new exercises (exercises that don't exist in this track).
- Pick one from the list of exercises (implemented in other tracks).
- Create a new, track-specific exercise from scratch.
Let's say you want to implement a new exercise from the list of (practice) exercises, because you've noticed that this track could benefit from it, really liked it in another track, or just find it interesting. The first step is to check for an open issue. If there is one, make sure no one is working on it, and above all that there is no open Pull Request for this exercise.
If there is no such issue, you may open one. The baseline of work is as follows:
- Open a new issue; we'll label it with `new exercise ✨`.
- We'll assign the issue to you, so you get to work on this exercise.
- Create a new folder in `/exercises`.
- You'll need to sync this folder with the matching config files. You can use `sync` to do this: `ASSIGNMENT=practice/slug corepack yarn sync`.
- Create a `<slug>.ts` stub file.
- Create a `<slug>.test.ts` test file. Here add the tests, per canonical data if possible (more on canonical data below).
- Create a `.meta/proof.ci.ts` file. Place a working implementation in it, written as if it were renamed to `<slug>.ts`. (See the sketch after this list for what these three files might look like.)
- Create `.meta/tests.toml`. If the exercise being implemented has test data in the problem specifications repository, the contents of this file must be a list of UUIDs of the tests that are or are not implemented. Scroll down to tools to find the configlet application, which aids in generating this file interactively.
- Create a `.meta/config.json`. Copy the structure from any other `.meta/config.json`. Fill the `blurb`, `source` and `source_url` according to the `metadata.yml` in the problem specifications repository. Add yourself as author.
- Create a `.docs/instructions.md` file. Copy the instructions from the problem specifications repository.
- Run the tests locally, using `scripts/test`: `ASSIGNMENT=practice/slug corepack yarn test`.
- Run the linter locally, using `scripts/lint`: `ASSIGNMENT=practice/slug corepack yarn lint`.
- Create an entry in `config.json`: generate a unique new UUID (you can use the `configlet uuid` tool to generate one; scroll down to tools to see how you can get it), give the exercise a difficulty (it should be similar to comparable exercises), and make sure the order of the file is sane. Currently the file is ordered first on core versus non-core, then on difficulty from low to high, and finally lexicographically.
- Format the files, using `scripts/format`: `corepack yarn format`.
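To make the file layout concrete, here is a minimal, hypothetical sketch for the `two-fer` exercise (the function name and expected strings follow its canonical data; the track's exact conventions may differ, so copy an existing exercise as your starting point):

```typescript
// two-fer.ts — the stub handed to the student
export function twoFer(name?: string): string {
  throw new Error('Remove this statement and implement this function')
}
```

```typescript
// two-fer.test.ts — tests written per the canonical data
import { twoFer } from './two-fer'

describe('twoFer()', () => {
  it('no name given', () => {
    expect(twoFer()).toEqual('One for you, one for me.')
  })

  it('a name given', () => {
    expect(twoFer('Alice')).toEqual('One for Alice, one for me.')
  })
})
```

```typescript
// .meta/proof.ci.ts — a working implementation, written as if it were two-fer.ts
export function twoFer(name = 'you'): string {
  return `One for ${name}, one for me.`
}
```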
The final step is opening a Pull Request with all of these items checked off. Make sure the tests run and the linter is happy; both run automatically on your PR.
The steps for a track-specific exercise are similar to those of implementing an established, existing exercise. The differences are:
- You'll have to write a README.md and a test suite from scratch.
- You'll have to come up with a unique slug.
- You'll need a new icon for it.
- Generate a UUID, for example using configlet.
Open a new issue with your proposal, and we'll make sure all these steps are correctly taken. Don't worry! You're not alone in this.
There are always improvements possible on existing exercises.
`README.md`: the description that shows up on the student's exercise page when they are ready to start.
It's also downloaded as part of the exercise's data.
The `README.md`, together with the `<slug>.test.ts` file, forms the contract for the implementation of the exercise: no test should force a specific implementation, and no `README.md` explanation should give away a certain implementation.
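As a hypothetical illustration using a `reverse-string` exercise: a good test pins down observable behaviour only, leaving students free to pick any approach.

```typescript
import { reverseString } from './reverse-string'

describe('reverseString()', () => {
  // Good: asserts only on input/output behaviour
  it('reverses a word', () => {
    expect(reverseString('exercism')).toEqual('msicrexe')
  })

  // Avoid: anything that would only pass for one specific approach,
  // such as asserting that the solution calls Array.prototype.reverse
  // or inspecting the function's source code.
})
```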
The `README.md` files are generated, which is explained here.
- This file may need to be regenerated in order to sync with the latest canonical data.
- You may contribute track-specific `hints.md` files, as listed in that document.
- You may improve the track-specific `exercise-readme-insert.md`, and regenerate all the READMEs.
Note: in v3, this will no longer be exactly the same. We'll update this section of the guide when v3 is live.
Syncing an exercise with canonical data: There is a problem-specifications repository that holds test data in a standardised format. These tests are occasionally fixed, improved, added, removed or otherwise changed. Syncing an exercise consists of:
- updating `tests.toml`;
- updating the `<slug>.test.ts` file;
- updating the `.meta/tests.toml` file, if the exercise being updated has test data in the problem specifications repository (the contents of this file can be updated interactively using configlet);
- matching the `proof.ci.ts` file so it still works with the new tests; and
- regenerating the `README.md`, should there be any changes.
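A possible workflow for such a sync, assuming configlet has been downloaded into `./bin` via `fetch-configlet` (see the tools section below) and using `two-fer` as the exercise:

```bash
./bin/configlet sync                            # interactively update .meta/tests.toml
# ...update <slug>.test.ts and .meta/proof.ci.ts by hand...
ASSIGNMENT=practice/two-fer corepack yarn test  # verify the proof passes the new tests
ASSIGNMENT=practice/two-fer corepack yarn lint  # and that the linter is happy
```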
Mentor notes are the notes given to mentors to guide them with mentoring.
These notes do not live in this repository, but in the `website-copy` repository.
Find their contributing guidelines here.
Some exercises already have automated mentoring support.
These automations don't live in this repository, but in the `typescript-analyzer` repository.
Find their contributing guidelines here.
There is quite a bit of student-facing documentation, which can be found in the `docs` folder.
You may improve these files by making the required changes and opening a new Pull Request.
You'll need Node.js LTS or higher in order to contribute to the code in this repository. Run `yarn` in the root in order to be able to run the scripts listed below.
We use the following dependencies:

- `shelljs` in order to provide a shell interface to scripts
- `eslint` for linting all code in the stub, test file and example file
- `jest` to run all the test files on all example implementations
- `babel` to transpile everything so it works regardless of your version of Node.js
We also use `prettier` to format the files. When possible, use `corepack yarn format` to run prettier.
If you want to auto-format using your editor, match the version in the GitHub Workflow `verify-code-formatting.yml`.
If you'd like to download configlet, you can use the `fetch-configlet` binary.
It will run on Linux, macOS and Windows, and download `configlet` to your local drive.
Find more information about configlet here.
If a track implements an exercise for which test data exists, the exercise must contain a `.meta/tests.toml` file. The goal of the `tests.toml` file is to keep track of which tests are implemented by the exercise. Tests in this file are identified by their UUID, and each test has a boolean value that indicates whether it is implemented by that exercise.

A `tests.toml` file for a track's `two-fer` exercise looks like this:
```toml
[canonical-tests]

[19709124-b82e-4e86-a722-9e5c5ebf3952]
description = "no name given"
include = true

[3451eebd-123f-4256-b667-7b109affce32]
description = "a name given"
include = true

[653611c6-be9f-4935-ab42-978e25fe9a10]
description = "another name given"
include = false
```
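Each UUID identifies a test case in the canonical data; its `description` will typically match the name of the corresponding test in the `<slug>.test.ts` file, and cases marked `include = false` are deliberately not implemented.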
To make it easy to keep the `tests.toml` files up to date, contributors can use the `configlet` application's `sync` command.
This command compares the tests specified in the `tests.toml` files against the tests that are defined in the exercise's canonical data.
It then interactively gives the maintainer the option to include or exclude test cases that are currently missing, updating the `tests.toml` file accordingly.
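For example, assuming configlet was downloaded into `./bin` as described below, running `./bin/configlet sync` from the repository root will walk you through the missing test cases interactively.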
We have various `scripts` to aid with maintaining and contributing to this repository.
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn format`.

```javascript
/*
 * Run this script (from root directory):
 *
 * $ corepack yarn format
 *
 * This runs `prettier` on all applicable files, FORCES using the same version
 * as the CI uses to check if the files have been formatted.
 */
```
Use this script to format all the files using the correct version of prettier.
If you want your editor to do this automatically, make sure you install the dependencies first: `corepack yarn install`.
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn lint`.

```javascript
/*
 * Run this script (from root directory):
 *
 * $ corepack yarn lint
 *
 * This runs `eslint` on all sample solutions (and test) files
 */
```
If the `ASSIGNMENT` environment variable is set, only that exercise is linted.
For example, if you only want to lint `two-fer`, you may, depending on your environment, use:

```bash
ASSIGNMENT=practice/two-fer corepack yarn lint
```
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn test`.

```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn test
 *
 * This runs `jest` tests for all sample solutions
 */
```
If the `ASSIGNMENT` environment variable is set, only that exercise is tested.
For example, if you only want to test the `.meta/proof.ci.ts` for `two-fer`, you may, depending on your environment, use:

```bash
ASSIGNMENT=practice/two-fer corepack yarn test
```
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn sync`.

```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn sync
 *
 * This script is used to propagate any change to the root package.json to
 * all exercises and keep them in sync.
 * There is a CI step which checks that package.json in root & exercises match
 * (see the checksum script for more info).
 */
```
If the `ASSIGNMENT` environment variable is set, only that exercise is synced.
For example, if you only want to sync the files for `two-fer`, you may, depending on your environment, use:

```bash
ASSIGNMENT=practice/two-fer corepack yarn sync
```
```javascript
/*
 * Run this script (from root directory):
 *
 * $ corepack yarn node scripts/checksum.mjs
 *
 * This will check that the root `package.json` matches each exercise's `package.json`.
 * The catch: some dependencies are used for the build but not served to end users.
 * We skip those dependencies while performing the checksum.
 * See `SKIP_PACKAGES_FOR_CHECKSUM` in helpers.js for the list of skipped packages.
 */
```
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn ci:check`.

```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn ci:check
 *
 * This will run the following checks:
 *
 * 1. Check config in all exercises matches
 * 2. Check stubs exist
 * 3. Run eslint to check code-style
 */
```
Run this script to check the stubs and configuration integrity, and to lint the code.
This has been aliased as a top-level `package.json` `scripts` script: `corepack yarn ci`.
This script is almost the same as `test`; you may use them interchangeably at the moment of writing.
```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn ci
 *
 * This will run the following checks:
 *
 * 1. Find the exercises
 * 2. Run tests against sample solutions
 */
```
Run this script to test all exercises.
```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn node scripts/name-check.mjs
 *
 * This will run the following check:
 *
 * 1. Package name is of the format "@exercism/typescript-<exercise>"
 *
 * This script also allows fixing these names:
 *
 * $ corepack yarn node scripts/name-check.mjs --fix
 */
```
Run this script to check whether the package name in each exercise's `package.json` follows the expected format.
Run it with the `--fix` flag to automatically fix the names.
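For example, the `two-fer` exercise's `package.json` is expected to have the name `@exercism/typescript-two-fer`.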
```javascript
/**
 * Run this script (from root directory):
 *
 * $ corepack yarn node scripts/name-uniq.mjs
 *
 * This will run the following check:
 *
 * 1. All exercises have unique package names in their package.json files.
 */
```
Run this script to check whether there are any duplicate package names.