TAL comes complete with an extensive set of unit tests.
The tests themselves are located in
The tests mimic the file structure of TAL, with one test file per implementation file.
Ensure you have
npm available on your system by installing Node.js.
Then ensure that grunt-cli is installed:
npm install grunt-cli -g
Next, install the required Node packages into the root of your TAL working copy. Change to the root of the working copy, then run:
npm install
You now have the choice of running the tests in the console, or in a browser window.
Tests can be run in the console, using PhantomJS (installed as part of
npm install), with the following command:
grunt test
The following command will generate SpecRunner.html and open it in your default browser, running the tests in the Jasmine UI:
grunt spec
Note that SpecRunner.html will persist after
grunt spec, whereas it will be deleted after
grunt test. Because it is a generated file, it should not be committed to the repository.
Running the tests in a browser window is useful for debugging, while running them in the console is useful for CI jobs and for sanity checking.
You can run a subset of the tests by applying a filename filter to either
grunt spec or
grunt test. For example:
grunt test --filter=carousel
Most TAL tests are defined within JsTestDriver AsyncTestCase instances, and most use sinon.js for stubs, spies and mocks.
Newer TAL tests, particularly ones that can run synchronously, are written directly in Jasmine. This is the desired future direction of the framework.
The JsTestDriver adaptation layer we use to run old JsTestDriver tests under Jasmine is located at:
Hopefully you will not need to investigate it, but it’s useful to know it’s there.
We are currently on Jasmine 1.3 rather than the more recent Jasmine 2.0, because the adaptation layer is written against Jasmine 1.3 and utilises some of its internals. We hope to update to Jasmine 2.0 at some point in the future.
queuedRequire() and queuedApplicationInit() are helper methods for loading framework modules under test and ensuring they are unloaded in teardown. Use them as follows:
queuedRequire() if the module under test is isolated and does not require an initialised application context (directly or indirectly)
queuedApplicationInit() if the module under test needs an application context
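A queued test in the old JsTestDriver style typically looks like the skeleton below. The test name, module path and assertion are hypothetical, and the queuedApplicationInit() signature is sketched from memory of the TAL helpers, so check an existing test file for the exact arguments:

```javascript
this.ExampleTest = AsyncTestCase("Example");

this.ExampleTest.prototype.testWidgetIsCreated = function (queue) {
    // Hypothetical module path and callback arguments.
    queuedApplicationInit(queue, "lib/mockapplication", ["antie/widgets/button"],
        function (application, Button) {
            assertInstanceOf(Button, new Button("id"));
        }
    );
};
```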
We create a Sinon sandbox in the test case's
setUp() method and restore it in
tearDown(). You should access Sinon's methods through the sandbox to ensure any stubs/spies/mocks are removed after each test.
If you wish to use Sinon’s assertions and have JsTestDriver’s
expectAsserts() method include them in its assertion count, you need to define the
You can stub out methods on a dependency by loading it via the queued functions, then stubbing its prototype before instantiating the dependent class.
As require only loads each module once, the dependent module gets the stubbed method.
The TAL server-side Node.js code is tested using nodeunit:
1. Install nodeunit: https://github.com/caolan/nodeunit
npm install -g nodeunit
2. Run the tests:
cd node-test
nodeunit .
Expected sample output:
antieframeworktest
✔ Generic TV1 Device has no Headers
✔ Generic TV1 Device has no body
✔ Generic TV1 Device has default Mime type
✔ Generic TV1 Device has default Root element
✔ Generic TV1 Device has default Doc type
✔ Device has expected header
✔ Device has expected body
✔ Device has expected Mime type
✔ Device has expected Root element
✔ Device has expected Doc type
✔ Normalise key names replaces special characters with underscores
✔ Normalise key names replaces upper case to lower case
✔ Get generic device config
✔ Get generic app config
✔ Get generic app config (Alt)
✔ App config overrides device config when merged

OK: 16 assertions (23ms)