Make Robust Basic Engine Robots
Websites often change without notice. Robustness describes how well a robot copes with such changes: the more changes the robot can handle and still work correctly, the more robust it is.
Making robust robots involves analyzing the website in question and understanding how it responds in various situations, such as when a registration form is filled out incorrectly. In a sense, writing robust robots is a kind of reverse engineering of the website's logic, and usually the only way to do this is through exploration.
The two approaches to robustness each serve a different purpose:
- Succeed as much as possible.
- Fail if not perfect.
Succeeding as much as possible might, for a robot that extracts news items, mean extracting as many of them as possible, even when some are incomplete or formatted unusually. In Basic Engine Robots, you use conditional actions, Try steps, and data converters to deal with different layouts, missing information, and unusually formatted content.
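Design Studio expresses this pattern with visual steps rather than code, but the underlying idea translates to any language. The following is a minimal Python sketch, not Design Studio code, of the succeed-as-much-as-possible approach: try alternative fields (the analogue of conditional actions and Try steps), clean up inconsistent data (the analogue of data converters), and skip only the items that are truly unusable. All names, field keys, and date formats here are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NewsItem:
    title: str
    published: str | None  # None when the date cannot be recovered

def normalize_date(raw: str) -> str | None:
    """Converter-style cleanup: try several known date formats."""
    for fmt in ("%Y-%m-%d", "%d %B %Y", "%m/%d/%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # unusual format: keep the item, drop only the date

def extract_items(records: list[dict]) -> list[NewsItem]:
    items = []
    for record in records:
        # Try-step analogue: fall back to an alternative field, and
        # skip only the records where nothing usable is present.
        title = record.get("headline") or record.get("title")
        if not title:
            continue  # skip this item, keep extracting the rest
        date = normalize_date(record.get("date", ""))
        items.append(NewsItem(title=title, published=date))
    return items
```

The key design choice is that a failure in one item never aborts the whole extraction; the robot returns every item it could recover.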
Failing when things are not perfect might, for an order submission robot, mean that it should fail immediately if it cannot determine how to enter a field correctly, or if the order result page does not exactly match the expected layout. In this sense, failing does not mean generating an API exception. Instead, the robot should return a value dedicated to describing errors and failure causes. Robots that take input variables typically fail fast rather than succeed as much as possible. In Basic Engine Robots, you can use dedicated error type variables, error handling, and conditional actions to detect and handle unexpected situations.
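Again as an illustration outside Design Studio, here is a minimal Python sketch of the fail-if-not-perfect approach: any deviation from the expected input stops processing immediately, and the failure is reported as a returned error value (the analogue of an error type variable) rather than a raised exception. The `OrderResult` type, field names, and validation rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class OrderResult:
    ok: bool
    order_id: str | None = None
    error: str | None = None  # failure cause returned to the caller, not raised

EXPECTED_FIELDS = {"customer", "sku", "quantity"}

def submit_order(form: dict) -> OrderResult:
    # Fail fast: any deviation from the expected form layout stops the robot
    # before it submits anything.
    missing = EXPECTED_FIELDS - form.keys()
    if missing:
        return OrderResult(ok=False, error=f"missing fields: {sorted(missing)}")
    if not str(form["quantity"]).isdigit():
        return OrderResult(ok=False, error="quantity is not a whole number")
    # ... submit the order and verify the result page here ...
    return OrderResult(ok=True, order_id="placeholder-id")
```

The caller inspects `result.ok` and `result.error` to decide what to do next, which mirrors how a robot's error variable lets the consuming system distinguish a clean failure from a successful run.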
For more information on Design Studio techniques to make robots more robust, consult the following sections: