Basic Engine Robots
This section describes the different tasks involved in creating Basic Engine Robots and the types used by these robots.
Basic Engine Robots were initially designed to automate stateless websites and applications, where any state resides internally in the robot itself. Most robots can be divided into two parts: a navigation part and an extraction part.
Navigation is concerned with "getting to where the content is." It mainly involves loading pages and submitting forms. When navigating in Design Studio, you typically use the Click action to move through and among web pages.
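Design Studio expresses navigation graphically rather than in code, but a rough analogy in code may help clarify the idea. The Python sketch below uses the third-party requests library; the URL and form field name are hypothetical placeholders, not part of any real application.

    # A minimal sketch of the navigation phase: load a page, then submit a form.
    # The URL and the "query" form field are hypothetical placeholders.
    import requests

    session = requests.Session()  # a session carries cookies, much like a robot's internal state

    # "Load Page": fetch the page that contains the search form
    page = session.get("https://example.com/search")

    # "Submit Form": send the form data and land on the results page
    results = session.post("https://example.com/search", data={"query": "annual report"})
    print(results.url)  # we have now "gotten to where the content is"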
Extraction is concerned with "getting the right content." It mainly involves selecting, copying, and normalizing content. When extracting in Design Studio, you typically use the Test Tag action to skip uninteresting ("noisy") content, the Extract action to copy content into variables, and data converters to normalize the content into the format you want, such as the correct date and number formats. Once the value is extracted, you output it with the Store in Database or Return Value action.
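Again as an analogy rather than Design Studio's actual mechanism, the following Python sketch mirrors the extraction phase: it skips a noisy row (the Test Tag idea), copies values into variables (the Extract idea), and normalizes date and number formats (the data-converter idea). The HTML snippet and its structure are invented for illustration, and the sketch uses the third-party beautifulsoup4 library.

    # A minimal sketch of the extraction phase: select content, skip noise,
    # and normalize it into the desired date and number formats.
    # The HTML snippet and its tag structure are hypothetical.
    from datetime import datetime
    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    html = """
    <table>
      <tr><td>Advertisement</td><td>--</td></tr>
      <tr><td>2024-03-05</td><td>1,234.50</td></tr>
    </table>
    """
    soup = BeautifulSoup(html, "html.parser")

    records = []
    for row in soup.find_all("tr"):
        date_text, amount_text = (td.get_text(strip=True) for td in row.find_all("td"))
        if date_text == "Advertisement":   # the Test Tag idea: skip "noisy" rows
            continue
        records.append({
            # the data-converter idea: normalize into the formats we want
            "date": datetime.strptime(date_text, "%Y-%m-%d").date(),
            "amount": float(amount_text.replace(",", "")),
        })

    print(records)  # the Return Value idea: hand the normalized data onward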
Steps help you log in to applications, extract data from web pages, enter data into forms or search boxes, make menu selections, and scroll through multiple pages. Your robot can also access databases, files, APIs, web services, and other robots, exporting data from one application and loading it into another, transforming it as necessary along the way.
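As an illustration of this export-transform-load pattern, here is a small Python sketch that reads records exported from one application as a file and loads them into a database for another, transforming field types along the way. The file name, column names, and table schema are all hypothetical.

    # A minimal sketch of moving data between applications: read records from
    # a CSV export, transform them, and load them into a database.
    # File name, column names, and table schema are hypothetical.
    import csv
    import sqlite3

    conn = sqlite3.connect("target.db")
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, total REAL)")

    with open("orders_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            # transform along the way: coerce text fields to the target types
            conn.execute(
                "INSERT INTO orders (id, total) VALUES (?, ?)",
                (int(row["order_id"]), float(row["total"].replace(",", ""))),
            )

    conn.commit()
    conn.close()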
Most robots also include actions other than the ones mentioned above, such as the For Each Tag action, which loads several similar-looking pages or extracts values from several similar-looking table rows.
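For instance, a loop over several similar-looking pages, analogous to what a For Each Tag action drives in a robot, might look like the following Python sketch; the URL pattern and page structure are hypothetical.

    # A minimal sketch of looping over several similar-looking pages and
    # extracting the same value from each. The URL pattern is hypothetical.
    import requests
    from bs4 import BeautifulSoup

    titles = []
    for page_number in range(1, 4):            # pages 1..3, all with the same layout
        response = requests.get(f"https://example.com/list?page={page_number}")
        soup = BeautifulSoup(response.text, "html.parser")
        # pick the same value out of each similar-looking element
        titles.extend(h2.get_text(strip=True) for h2 in soup.find_all("h2"))

    print(titles)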