
Speeding up PHPUnit tests 15 times

Testing is still something I feel I need to almost force myself to do, and what previously made it worse was that my tests took so long to run. While working on elev.io development, creating a better way for you to support your users (seriously, check it out and support what I do), those tests already took over 35 seconds to complete while we only had 50% code coverage.

It was enough for me to no longer bother consistently running tests, which almost defeats the purpose of self testing code. Until I got fed up enough to spend the time on speeding up the tests, by fifteen times (and almost halving memory usage).

Here’s what I did.

This post is somewhat Laravel-heavy; however, the end goal is applicable to any PHP unit testing setup.

Why was it so slow to begin with?

First, I’ll clarify why my tests were so slow. For all of my acceptance tests I wanted the full page to render, which meant that I couldn’t sanely mock every possible item that needed to be pulled from the database. So, for each and every test that needed data from the database, I had to set up and seed the database. The project I was working on was using Laravel, so I was doing something like this:

public function setupDatabase()
{
    // Rebuild the schema from scratch, then seed it
    Artisan::call('migrate');
    $this->seed();

    $this->migrated = true;
}

And I would call this function before each of the tests that needed a database (every test needs to run against a fresh, known database state that hasn’t been tainted by other tests). Just this migration and seeding process took over a second each time, which really starts to add up once you build a large test suite.

The initial test database setup

Like a lot of people, I was putting the database in memory rather than relying on an actual on-disk MySQL database. I’d seen this technique used and recommended quite a lot, so I figured it was just the done thing and put up with it for a while.

The new test database setup

After getting fed up enough to find a faster way to set up and destroy the database for each and every test that needed it, I started to look at alternate database types. And one stood out, big time: SQLite. Being a flat file meant I could keep a prebuilt, seeded database, copy that file, run a test against the copy, then trash it. Rinse and repeat for each test that needed a database.

This way, instead of having to migrate a whole database and then fill it with data, all I needed to do was copy a single SQLite file.
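To make the idea concrete, here is a minimal, self-contained sketch of the technique (the table, seed data, and function names are made up for illustration, not taken from the real project):

```php
<?php

// One-off step: build and seed the stub database
// (in the real project this is done by artisan's migrate/seed commands).
function buildStub($stubPath)
{
    $db = new SQLite3($stubPath);
    $db->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
    $db->exec("INSERT INTO users (name) VALUES ('alice'), ('bob')");
    $db->close();
}

// Per-test step: a plain file copy yields a fresh, fully seeded database.
function freshDatabase($stubPath, $testPath)
{
    copy($stubPath, $testPath);
    return new SQLite3($testPath);
}
```

Copying a small file takes milliseconds, compared to over a second for a full migrate-and-seed run.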

Since I was using Laravel, I set up my app/config/testing/database.php file with the following:

return array(

    'default' => 'testing',

    'connections' => array(

        'setup' => array(
            'driver'   => 'sqlite',
            'database' => __DIR__.'/../../database/testing/stubdb.sqlite',
            'prefix'   => '',
        ),

        'testing' => array(
            'driver'   => 'sqlite',
            'database' => __DIR__.'/../../database/testing/testdb.sqlite',
            'prefix'   => '',
        ),

        'sqlite' => array(
            'driver'   => 'sqlite',
            'database' => ':memory:',
            'prefix'   => '',
        ),

    ),

    'migrations' => 'migrations',

);


You’ll notice there are two file-based SQLite connections (‘setup’ and ‘testing’); this is so I can easily create a new stub file if my schema or seed data changes. Using Laravel’s artisan tool, I can run the following to regenerate my stub database:

php artisan migrate:refresh --seed --database="setup" --env="testing"

NB: the first time you set up your stub file, you’ll need to manually create the stubdb.sqlite file and run a normal migration without the “refresh”.

So now, when I run my tests, instead of calling the migration and seed commands, I simply copy the stub database file like this:

public function setupDatabase()
{
    exec('rm ' . __DIR__ . '/../database/testing/testdb.sqlite');
    exec('cp ' . __DIR__ . '/../database/testing/stubdb.sqlite ' . __DIR__ . '/../database/testing/testdb.sqlite');
}
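As an aside, shelling out to `rm` and `cp` is Unix-only. An equivalent sketch (paths and function name are illustrative) uses PHP’s native copy(), which overwrites an existing destination, so no separate delete step is needed, and it works on Windows too:

```php
<?php

// Portable variant of setupDatabase(): copy() overwrites the
// destination file if it already exists, so no 'rm' is required.
function refreshTestDatabase($stubPath, $testPath)
{
    if (!copy($stubPath, $testPath)) {
        throw new RuntimeException("Failed to copy {$stubPath} to {$testPath}");
    }
}
```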

Bonus: Auto updating the seed database file

If you’re not using something like grunt or gulp already, get on it. Seriously. They are amazing for optimizing your workflow.

I’m using gulp since I like its format a lot more; it just seems more readable and easier to work with.

Gulp comes in really handy by watching certain files for updates and acting according to rules that you set up. You can tell it to watch for changes to any PHP files in your main app folder and automatically run your tests. Or you could have it watch for updates to any CSS/LESS files, compile and compress them into a single minified file immediately, and reload the browser to show your changes. There’s so much you can do, but that’s for another post.

What I’ve also decided to do with gulp is watch the Laravel migration and seed files; as soon as I make a change to those, it automatically runs the above migration command to recreate the stub database file.

Testing just got a whole lot more accessible, and sane, for me.

Edit & final note

A few people have mentioned in the comments that this isn’t pure unit testing, and they are 100% correct. But I’m OK with that; I, and a lot of other people, don’t care whether the testing is ‘pure’, as long as we’re testing. Should I be stricter about testing? Yes, of course. But for now, this approach keeps me testing.

  • Jon Lynch

    Thank you for sharing this. I too believed what I had read about setting up a sqlite database in memory. Your approach makes running tests that hit the database much less painful.

  • Thomas Clarkson

    One of the best tutorials I have read in a while. My tests are now blazingly quick :)

    Would you be able to post your grunt code for recreating the stub database file? Cheers

    • http://www.subooa.com.au Chris Duell


      I’m actually using gulp, not grunt. I can post that though if you’re interested?

      • Thomas Clarkson

        Even better, yes please :)

  • thepixeldeveloper

    This solution isn’t the proper one, sqlite might have different quirks to MySQL and Postgres, so you don’t know how your code will react in production.

    You shouldn’t care how data is stored, use mocking to ensure your classes react the way you hope they do. You’ll cut down the time your tests run into the 500 per second range. Proper dependency injection can really help here.

    • http://www.subooa.com.au Chris Duell

      I completely agree that where possible, you should be mocking what you can to not even touch the database since you should be testing only your code, not the db connection.

      But in acceptance testing, it’s near impossible to mock absolutely everything that gets shown to the user, particularly on a busy screen like a dashboard that uses a bunch of partials to pull in certain information (which is where my need for this initially came about).

      Whatever you’re using for your db connection should have its own unit tests anyway to make sure any quirks are either known or handled.

      • Pixelentophomy

        What are the best practices for testing queries that are not database agnostic?

        Using your busy dashboard screen example, say you’re using MySQL and have a simple query that uses CONCAT() to join the first and last name of a user. This query would fail to execute in SQLite.

        Are you supposed to avoid using those types of functions in queries, and instead marry the information together using PHP?

        Or, how about something like an UPDATE query that also has a JOIN? Something that SQLite does not support without re-writing it as a sub-query.

        I agree that switching to SQLite for tests is the way to go for speed. However, I’ve run into some hurdles with my non-database-agnostic queries that I’m not sure how to overcome, short of just re-writing them.

        Thanks for the great article and your feedback!

        • http://www.subooa.com.au Chris Duell

          A very valid point, and no doubt something I’ll come across in the coming weeks (at that point I’ll likely update this post or create a new one).

          In cases where you explicitly need MySQL functionality, you could perhaps instead call setupDatabase($engine), passing the engine you want to use, e.g. mysql or sqlite. Then make some tweaks to make sure that during that test case, the specific db engine is used.

          That way you can control if you want to use sqlite for the simple stuff to keep it super fast, and mysql for the tests that rely on certain MySQL only functionality.

  • http://dimsav.com/ Dimitrios Savvopoulos

    Your idea was genius! I tested it and the difference was huge. I believe this way of testing the db should be recommended by laravel docs in place of :memory:.

    Thanks a lot.

    • http://www.subooa.com.au Chris Duell

      Glad it helped you out!

      I’m playing about with Codeception at the moment, using the same approach in the _bootstrap.php file to keep it snappy.

  • http://jorhult.se Mikael Jorhult

    I was refactoring my test suite today and this approach helped me increase the speed of my tests a lot! Thank you for sharing!

  • Mike Hopley

    This is brilliant. It took my test suite down from 1.63 minutes to 58 seconds.

    But that’s not the whole story, because most of that time is spent on slow tests calling external APIs — and I rarely revisit those classes. The big improvement is where I have a class with lots of database tests, which get run on save. This sort of test drops from maybe 5 seconds down to 300 milliseconds. That’s a huge “quality of life” improvement in my workflow.

    A couple of suggestions:

    The ‘rm’ and ‘cp’ commands are not cross-platform (i.e. it didn’t work on Windows for me). You can use Laravel’s File class. For example, this is the method in my TestCase:

    protected function setupDatabase()
    {
        $path = app_path() . '/tests/_data/sqlite/';

        File::copy( $path . 'dump.sqlite', $path . 'active.sqlite' );
    }

    On that note, I like putting this in my TestCase.php file. Then I can just call $this->setupDatabase() in the setUp() method of the test.

    Anyway, thank you very much. This made a big difference and was really easy to set up. :)

    • http://chrisduell.com/ Chris Duell

      oh awesome results!

      good suggestion on the File::copy usage, makes sense

      • Mike Hopley

        I’ve also created a command called db:refresh, which wraps up some things:

        1. Migrate and seed the main DB.
        2. Refresh the sqlite dump
        3. Refresh a mysql dump for Codeception acceptance tests

        Thus I call db:refresh and all my DB stuff is updated at once.

        To do this, I moved the “setup” DB connection into the main (production) config, and renamed it to something more “specific” (I called it functional_dump). This is necessary because Artisan::call() ignores environments.

        Then you can call:

        Artisan::call( 'migrate', [
            '--seed'     => true,
            '--database' => 'functional_dump',
        ] );

      • nathan32

        Then how about you update your tutorial?

        • http://chrisduell.com/ Chris Duell

          Because while this article is mostly Laravel-based, the copying of a stub database is the crux of the solution, and I didn’t want to ostracise people not using Laravel.

  • Mohamed Said

    I really can’t thank you more. This approach is very handy.

    • http://chrisduell.com/ Chris Duell

      hmmm very strange, perhaps add a slight timeout before the process of running a new test just to check that hypothesis. I’m not sure, sorry.

      • Mohamed Said

        I found that when I only use the “cp” command, it overwrites the file with setup.sqlite with no problems. Not sure why the remove command did not work, but thank you so much for such an amazing idea.

  • http://www.socialengine.com/ Stepan Mazurov

    Hey, thanks for this, it made our tests go from 3.x minutes to 50s.

    I wrapped it up as a laravel package: https://github.com/socialengine/setup-test-db in case people want to save a bit of time and keep things composered up.

    • http://chrisduell.com/ Chris Duell

      oh nice work, don’t know why I didn’t think to do that myself!

  • kurashu89

    This strikes me as speeding up integration tests, rather than unit tests. Unless you’re writing a database driver or an ORM, why are your unit tests relying on the database rather than just data objects?

    • http://chrisduell.com/ Chris Duell

      Because I’m not anal about my testing.
      This article isn’t for testing pros, but more for people getting into testing, and it removes a barrier for them, which is speed, when they do use a database (it’s worth noting that not *all* our tests use the db, so in those cases it’s not fired up)

      • kurashu89

        If it works for you that’s great. I still stand by that it’s not really a unit test though. Testing against the database — especially if an ORM is involved, gotta make sure those models are right — is important.

        I do find the advice to test and develop against a different database than the one you use in production questionable, but again, if it works for you.

        Don’t take this as an attack or harsh criticism on your testing, development and deployment cycles. Just sharing thoughts.

        • http://chrisduell.com/ Chris Duell

          Not taking it personally at all; the reason this post was published was to spark up conversations like this, with people more proficient at testing giving advice to those like myself who still have a lot to learn. This was a post purely to keep people in the testing game, not getting fed up with speed and bailing on tests (whether they are doing it ‘right’ or not).

          Thanks for chiming in with your experience

  • Sobit

    Not sure why you call your tests “unit” tests if they are operating with a database. If you really have to test your functionality against the DB, you should consider splitting your tests into a unit test suite and an integration/functional test suite.

  • nickolaich

    Nice approach, thank you. One of my tests went from 33 sec down to 1.34 sec. Actually, SQLite causes trouble with some migrations (altering columns of type enum, UNIX_TIMESTAMP, etc.), but for the current stage of development these are very good results for us.

    I have improved it a little:
    1. Read the db’s configuration instead of hardcoding paths, so you only ever edit the config files:
    exec(sprintf('rm %s', Config::get('database.connections.testing.database')));
    exec(sprintf('cp %s %s', Config::get('database.connections.setup.database'), Config::get('database.connections.testing.database')));

    2. Added selective seeding. By default it seeds the whole db; for some tests I only need one or two tables:

    if (is_array($this->_seedList)) {
        foreach ($this->_seedList as $seedClass) {
            $this->seed($seedClass);
        }
    } else {
        $this->seed();
    }

    3. We can pre-seed some of the stable db tables into the “setup” database.

  • Richard Robinson

    Pure genius! I was about 25% into my TDD and was already excluding earlier tests in the interest of saving time. Obviously not a good thing to do! I haven’t seen a 15x reduction in time, but mine is running about 4 times as fast. Thank you.

  • JonoB

    Can I ask – how do you seed your SQL database if you have foreign key relationships in your migrations?

    • http://chrisduell.com/ Chris Duell

      Using Laravel’s seeder, and building up the content in its logical order.

  • Keith Humm

    Have you considered trying a copy from the file to an in-memory database for each test? It might be a bit quicker still.
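    This file-to-memory copy is possible with SQLite’s online backup API; a sketch, assuming PHP >= 7.4 for SQLite3::backup() (the function name and path are illustrative):

```php
<?php

// Sketch: load the on-disk stub straight into a fresh in-memory
// database via SQLite's online backup API (SQLite3::backup,
// available since PHP 7.4).
function loadStubIntoMemory($stubPath)
{
    $stub   = new SQLite3($stubPath, SQLITE3_OPEN_READONLY);
    $memory = new SQLite3(':memory:');
    $stub->backup($memory);   // copies all tables and rows in one call
    $stub->close();

    return $memory;
}
```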

  • Eric Willhoit

    If anyone else is getting:

    `PDOException: SQLSTATE[HY000]: General error: 8 attempt to write a readonly database`

    I deleted the `rm` command from the `setupDatabase` function and it works fine. The `cp` overwrites `testdb.sqlite` without the `rm`.

  • http://www.lollypop.gr/ George Drakakis

    Actually, there is a technical way to run MySQL in memory: mount a portion of RAM (a tmpfs mount) and store the MySQL data files there. I saw a huge difference doing this, and I think it’s better than running another type of DB for tests.