
Automatic testing

You’ve heard of automated testing, like unit tests. This is automatic testing.

Computers are here to make our lives easier. Many tedious tasks that people once had to do by hand are now automatic. If you’re a developer, you’ve probably had to write a lot of different tests. I’m afraid that’s not going away yet. But there are still things a computer can do to save a developer time. Why not build a program that does the testing for us, without us having to write a script for every test?

There’s no program that can do it all, so straight away you’re going to need to make some assumptions about the kind of thing you want to test. Let’s think about testing a website you’re building. That makes things easier, because URLs give us a handy way to identify what we’re currently looking at. To start with, it would be reasonable to run some tests on every page (URL) in your website.

In the age of GDPR it’s become harder to get real, good-quality data to fill out our web pages and provide realistic examples for testing. Even if you can’t get live data, hopefully you can set up an anonymised or demo instance of your website.

Crawling

If you’re lucky, maybe there’s a file somewhere in your code listing all the URLs on your website, but even if that’s the case, consider the URL:
https://www.example.com/database/table/id/4004
You probably can’t rely on the 4004th record always being there, and similarly you may not want to test the same page 4004 times. So instead we can try making a web crawler: a bot that will search your website for URLs.

There are lots of options for how to write this. It’s useful to pick something that will run the JavaScript on your pages, like Selenium.

Bot steps
  • Start on a landing page, a home page or maybe a login page
  • Do a login if needed
  • Get all links or URLs that appear on the page
  • Forget links to places not in your website, don’t test the whole internet for bugs!
  • Forget links we’ve already seen; you may want to consider /database/table/id/4003 and /database/table/id/4004 to be the same page if you can. Try not to crawl forever
  • Open the next page in the list and start over
  • Stop when you’ve checked all known pages for new links
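Here’s a minimal sketch of that loop in Python, using Selenium with headless Chrome. The start URL, the skipped login step and the rule for collapsing record IDs are all assumptions you’d adapt to your own site:

import re
from urllib.parse import urljoin, urlparse

from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

START_URL = "https://www.example.com/"      # landing or login page (assumption)
ALLOWED_HOST = urlparse(START_URL).netloc   # don't test the whole internet for bugs


def normalise(url):
    # Treat /database/table/id/4003 and /database/table/id/4004 as the same page
    return re.sub(r"/id/\d+", "/id/<n>", url.split("#")[0].rstrip("/"))


options = Options()
options.add_argument("--headless=new")
driver = webdriver.Chrome(options=options)

seen, queue = set(), [START_URL]
while queue:
    url = queue.pop(0)
    key = normalise(url)
    if key in seen:
        continue
    seen.add(key)
    driver.get(url)  # beware: a redirect means driver.current_url may not equal url
    for link in driver.find_elements(By.CSS_SELECTOR, "a[href]"):
        href = urljoin(driver.current_url, link.get_attribute("href") or "")
        parts = urlparse(href)
        if parts.scheme in ("http", "https") and parts.netloc == ALLOWED_HOST:
            if normalise(href) not in seen:
                queue.append(href)

driver.quit()
print(f"Found {len(seen)} distinct pages")

The normalise function is doing the “forget links we’ve already seen” step; collapsing anything after /id/ is just one example of a rule you might choose.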

That’s nice and simple but here are some headaches you may experience when you try this:

  • If your page does some delayed loading with JavaScript, you might miss links that aren’t loaded in time (see the wait sketch below)
  • If a page is only opened with a link in a pop up, button or another JavaScript action, it won’t be found
  • Sometimes a section of the page will only appear with the right data; if the one example we looked at doesn’t have that section, we might miss links from it
  • Pages don’t always look exactly the same every time you open them. For example if the data shown is based on the current time of day
  • Pages might redirect the bot. Just because you tried to open a URL, that doesn’t mean that’s the URL of the page you’ll end up on
  • Pages sometimes require a specific URL query, cookie or previous page to function normally

Which of these you need to worry about, and what can be done to get around them, will depend on the common practices of everyone working on the website.
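For the delayed-loading headache, an explicit wait before harvesting links usually helps. Here’s a minimal sketch with Selenium’s WebDriverWait; the ten-second timeout and the “at least one link present” condition are arbitrary choices:

from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait


def links_after_load(driver, timeout=10):
    # Wait until at least one link is present before collecting them all.
    # Waiting for a loading spinner to disappear is another common approach.
    WebDriverWait(driver, timeout).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, "a[href]"))
    )
    return driver.find_elements(By.CSS_SELECTOR, "a[href]")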

Next steps

So you’ve automatically got a list of pages. What can you do with them? This is where it gets difficult to say things generically. Some easy suggestions are:

  • If you have a display page for errors, you could save the contents every time you find it and perhaps send out a signal flare to the developers
  • Record every time the server gave an unexpected response code, like a 404, along with which URL and when (see the sketch after this list). Trigger a siren near the developers
  • Try some actions on the page. Have the bot write in text fields, change drop-down lists or click buttons. This can be random or based on common patterns specific to your website, e.g. an input with the label Note will accept just a “-”. This rabbit hole is very deep and very dark. You can go a long way teaching a bot to be human.
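As a sketch of the second suggestion, you could re-request each crawled URL and log anything unexpected. This assumes the requests library, and that a plain GET (reusing the bot’s cookies) reproduces what the browser saw:

import logging

import requests


def check_status_codes(urls, cookies=None):
    for url in urls:
        try:
            response = requests.get(url, cookies=cookies, timeout=30)
        except requests.RequestException as exc:
            logging.error("Request failed for %s: %s", url, exc)
            continue
        if response.status_code >= 400:
            # Swap the log call for your siren of choice
            logging.error("%s returned %s", url, response.status_code)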

The pop up problem

You open a pop up in a page: the current page gets darker and a smaller box expands from nothing, presenting you with some options. You, a human, know you need to interact with things in the white box to make it go away. A bot doesn’t. The bot still sees the entire page and the pop up. It probably doesn’t understand z-index or opacity. It may try changing things in the greyed-out part of the page for quite a while before trying something in the pop up. Even if it finds an error, you’re probably not worried about users interacting with impossible parts of the page through the JavaScript console.

OK, well, maybe you and your colleagues always use a Bootstrap Modal, so you could write an exception so the bot knows how to deal with that exact HTML. But that doesn’t stop someone doing something slightly different. This is far from the only situation where bots will stumble. Hopefully everyone will try to re-use or abstract code to make things easier, but it’s not something you can always depend on.
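If you do go down the exception route, a sketch of what it might look like for Bootstrap-style modals (the .modal.show selector is Bootstrap’s; your own components will need their own rule):

from selenium.webdriver.common.by import By


def interactable_elements(driver):
    # If a Bootstrap-style modal is open, only offer the bot elements inside it;
    # otherwise offer everything visible and enabled on the page.
    open_modals = [m for m in driver.find_elements(By.CSS_SELECTOR, ".modal.show")
                   if m.is_displayed()]
    scope = open_modals[0] if open_modals else driver
    return [el for el in scope.find_elements(By.CSS_SELECTOR,
                                              "a, button, input, select, textarea")
            if el.is_displayed() and el.is_enabled()]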

Try Lighthouse

Lighthouse is an open-source, automated tool for improving the performance, quality, and correctness of your web apps

Given a page, Lighthouse can be configured to run tests on performance, accessibility and more. It will generate a report detailing suggestions for how to fix what it finds, and it will also give the page a score out of 100 for each category. So instead of just looking for error messages, we can test that pages are quick and conform to accessibility standards. We might choose to have a spinning red light turn on next to the developers if a score gets worse or drops below some threshold.
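One way to wire that into the bot, as a sketch: assuming the Lighthouse CLI is installed (npm install -g lighthouse) and Chrome is available, you can shell out to it for each crawled URL and read the scores from the JSON report. The threshold here is an arbitrary example:

import json
import subprocess


def lighthouse_scores(url):
    # Ask the Lighthouse CLI for a JSON report on stdout and pull out the scores.
    result = subprocess.run(
        ["lighthouse", url, "--output=json", "--quiet", "--chrome-flags=--headless"],
        capture_output=True, text=True, check=True,
    )
    categories = json.loads(result.stdout)["categories"]
    # Lighthouse reports each category score as 0-1; scale to the familiar 0-100.
    return {name: round((cat["score"] or 0) * 100) for name, cat in categories.items()}


scores = lighthouse_scores("https://www.example.com/")
if min(scores.values()) < 80:  # arbitrary threshold
    print("Turn on the spinning red light")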

Have the computer do the work

This might all sound like a lot of work to set up, but imagine you or another developer adds or edits a page. Now, without doing any more work or having to remember anything extra, the page is checked for obvious errors, tested against accessibility and performance standards, and perhaps even interacted with by a bot taught to act human. Just try to remember that you started doing this to make less work.

